Tagged: GPU

    isvarahparamahkrsnah 6:57 pm on March 2, 2021 Permalink | Reply
    Tags: amdgpu_bl0, amdgpu_bl1, backlight brightness, drivers, GPU, kernel, Linus, Linus Torvalds, linux drivers, nouveau, nvidia drivers, NVIDIA X SERVER, NVIDIA X SERVER Settings, screen brightness

    AMD & NVIDIA: The Enemies Of Linux 

    I just can’t get a break from these dumbass computer hardware manufacturers and their stupid fuckups.

    One of the reasons I wanted a new laptop was to be able to run almost any Linux distro without any issues.
    But that doesn’t seem to be the case, does it?

    I’ve been waiting several weeks for AMD to fix a backlight problem where the screen brightness resets to 100% on every boot.

    Here’s an example of the error:

    systemd-backlight@backlight:amdgpu_bl0.service – Load/Save Screen Backlight Brightness of backlight:amdgpu_bl0
    Loaded: loaded (/usr/lib/systemd/system/systemd-backlight@.service; static)
    Active: failed (Result: exit-code) since Tue 2021-03-02 05:41:00 EEST; 5min ago
    Docs: man:systemd-backlight@.service(8)
    Process: 331 ExecStart=/usr/lib/systemd/systemd-backlight load backlight:amdgpu_bl0 (code=exited, status=1/FAILURE)
    Main PID: 331 (code=exited, status=1/FAILURE)
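
    If you’re hitting the same thing, here’s roughly how I’d poke at what the kernel actually exposes. A minimal sketch – the amdgpu_bl0 name comes from the log above and may differ on your machine (some setups show amdgpu_bl1 instead):

    # List the backlight interfaces the kernel registered
    ls /sys/class/backlight/

    # Check the current and maximum raw brightness values
    cat /sys/class/backlight/amdgpu_bl0/brightness
    cat /sys/class/backlight/amdgpu_bl0/max_brightness

    # Pull the full failure log for the systemd-backlight unit
    journalctl -b -u systemd-backlight@backlight:amdgpu_bl0.service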

    Now this problem has been mentioned on several sites – on the kernel site, on AMD’s site, on systemd’s site, and on several distro forums.
    Nobody seems to be doing anything to fix this big fucking problem.

    So my question is, how long will I have to wait for some googly-eyed nerd to sort out whatever’s causing the problem?

    I don’t wanna be blinded by my screen when I turn the laptop on at dawn every day.
    Every week I check the site for updates, and all I see are more users reporting similar problems.

    So this is my understanding of the issue: nobody is doing anything to fix a simple backlight problem. Several users have come up with their own hacks, but I’m not looking for hacks. I want a perfectly running Linux OS when I do my upgrades.
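
    For the record, the hacks floating around mostly boil down to a oneshot systemd unit that clamps the brightness after boot. A sketch, not an endorsement – the unit name, the amdgpu_bl0 device, and the value 128 are all placeholders you’d adjust:

    # /etc/systemd/system/clamp-backlight.service (hypothetical unit name)
    [Unit]
    Description=Clamp screen brightness after boot
    After=multi-user.target

    [Service]
    Type=oneshot
    ExecStart=/bin/sh -c 'echo 128 > /sys/class/backlight/amdgpu_bl0/brightness'

    [Install]
    WantedBy=multi-user.target

    Enable it once with systemctl enable clamp-backlight.service and it’ll stomp on the brightness every boot. Like I said: a hack, not a fix.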

    I’m not a patient man. Never was.

    Here’s my advice to anyone looking to purchase a laptop and run Linux on it: go for an Intel chip. Fuck it. Buy a laptop cooler, or a mini fridge to put your laptop in while you’re using it. At least you won’t be blinded by the fucking screen every time you boot or wake the damn thing up.
    Since both AMD and NVIDIA want to fuck around with Linux users, fuck them both! Intel’s dropping their own GPUs soon. So wait for them and get an Intel CPU with an Intel GPU! How about that, eh?

    I want all the nerds from all sides to get together – have a meeting, and figure out what’s causing this fucking problem and fix it.

    Imagine my horror when I’d just got a brand new fucking laptop, installed my OS on it, upgraded the kernel and voilà! Blinding screen on every boot.

    This problem will have to be fixed. I’m stuck with this machine for the rest of the decade. I don’t want to endure a stupid fucking backlight brightness problem for 10 years.

    I’m raising awareness here.
    This is me, calling out all the nerds and stating the problem.
    This isn’t me, coming out, guns blazing, blowing craters up your asses.
    But my next article will be a volcanic eruption. Linus Torvalds – that’s your cue.

    While I’m at it, here’s another problem: Why am I not seeing higher boost clock speeds on Linux? Is it AMD’s fault or the Linux kernel’s?
    I haven’t looked into it yet. But before I get down to it, I suggest y’all look into it first.
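
    If you want a head start, this is where I’d begin. A sketch – the boost knob below belongs to the acpi-cpufreq driver, so it may not exist on every kernel/driver combo:

    # Is the cpufreq boost toggle on? (1 = enabled; file exists only on some drivers)
    cat /sys/devices/system/cpu/cpufreq/boost

    # Driver, governor, frequency limits, and boost support in one dump
    cpupower frequency-info

    # Watch the per-core clocks wiggle under load
    watch -n1 'grep "cpu MHz" /proc/cpuinfo'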

    Now, onto NVIDIA.

    Everyone says AMD GPUs just work on Linux.
    So why don’t NVIDIA’s?
    Why doesn’t NVIDIA X SERVER Settings work?
    Now my laptop has a dedicated GPU that does fuck-all when running Linux.

    The nouveau nerds are saying NVIDIA keeps their graphics drivers closed source. That’s why I’m unable to make use of the card.
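
    You can at least confirm which driver the kernel actually bound to the card. A sketch using stock tools:

    # Show each GPU and the kernel driver in use for it
    lspci -k | grep -iA3 'vga\|3d controller'

    # Is nouveau loaded, or NVIDIA's proprietary module?
    lsmod | grep -e nouveau -e nvidia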

    Here’s my message to NVIDIA: NVIDIA, go fuck yourselves, you motherfuckers, you sons of wretched mothers!
    Open source your goddamn drivers and make sure they’re working like a charm on every goddamn Linux distro on this planet.
    Until that happens, I’m going to be writing an article every month, reminding the entire world just how much NVIDIA SUCKS.

    Linus, are you taking notes?

    AMD, lower your goddamn prices. Just because you’re making better chips than Intel doesn’t mean you should start gouging your customers’ wallets.
    If AMD’s 6000 series chips aren’t priced appropriately, I’ll be posting a dozen articles discussing why AMD’s chips aren’t worth the money.

    Linus, the backlight.

     

    isvarahparamahkrsnah 4:31 am on February 27, 2021 Permalink | Reply
    Tags: cooling, GPU, processors

    Laptop Standards: CPU 

    In the previous article, I rambled about displays and GPUs.

    I’d like to talk about CPUs briefly.

    If you’re getting a new laptop, go for a hexa-core or an octa-core chip.

    The current standard on laptops is quad-core. Most of the laptops you’ll see around have a minimum of 4 cores.
    But this bar has already been raised.
    By the end of this year, 6 cores and 8 cores will be the new baseline.

    If you’re making a purchase decision in 2021, let 6 cores be your baseline, if you want to be future-proof.

    Dual cores are dead and gone.
    If your laptop has a dual-core chip, you’re living in the past.

    Quad-core is being phased out by 6-core and 8-core chips.
    Every new laptop will now have 6 cores or 8 cores.

    Hyper-Threading is a thing.
    It was a thing when I got my first laptop, and it’s still a thing today.
    Always select a CPU that supports hyper-threading (AMD calls it SMT). A 6-core processor will then have 12 threads and an 8-core will have 16.
    It’s better and faster than a 6-core with 6 threads or an 8-core with 8 threads.
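
    If you’re not sure what your current machine has, Linux will tell you. A quick sketch – a hyper-threaded chip reports 2 threads per core:

    # Cores, threads per core, sockets, and total logical CPUs
    lscpu | grep -E '^(CPU\(s\)|Thread|Core|Socket)'

    # Shortcut: total number of logical processors
    nproc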

    Always choose a higher TDP. It means the chip has the thermal headroom to run at higher clock speeds for longer.
    15W TDP is a damn shame. Always go for a 35W or 45W CPU.
    Now all this info only applies to CPUs made in the past 4 years or so.
    And I’m making recommendations based on AMD’s site, because it’s simple and they make good-value CPUs.
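
    There’s no universal “TDP” file on Linux, but on Intel chips the kernel’s RAPL powercap interface exposes the configured package power limit. A sketch – paths vary by machine, and many AMD mobile chips don’t expose this interface at all:

    # Name of the first RAPL domain (usually package-0)
    cat /sys/class/powercap/intel-rapl:0/name

    # Long-term package power limit, in microwatts (e.g. 45000000 = 45 W)
    cat /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw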

    When I looked at Intel’s website, I was bewildered by the sheer number of processors they had. It left me confused and thinking about seppuku.
    When it comes to Intel, you never really know which processor is the best, because there is no simple and easy way to find out.
    And I guess this has been Intel’s way of conning people and robbing them of their money.

    If you’re getting a laptop with a CPU only, then one fan will suffice.
    But if you’re getting a laptop with a dedicated GPU, you’ll need at least two fans to provide an optimal cooling solution.

    Most laptops with dedicated GPUs now have 2 cooling fans, while some new ones have 3.
    If your laptop has only one fan with one or two cooling pipes, chances are, it’ll overheat and you’ll experience thermal throttling.

    Intel’s chips are said to run hotter than AMD’s. So unless you live in Siberia, you might wanna go with AMD.

     

    isvarahparamahkrsnah 11:12 am on February 2, 2021 Permalink | Reply
    Tags: Apple M1, GPU, Graphics cards, M1, SoC

    Apple M1: The Greatest Chip Of The Decade 

    The M1 is an ARM-based SoC designed by Apple, building on their A14 Bionic chip.

    It has four high-performance Firestorm and four energy-efficient Icestorm cores.

    The Firestorm cores have 192 KB of L1 instruction cache and 128 KB of L1 data cache, and share a 12 MB L2 cache.

    The Icestorm cores have a 128 KB L1 instruction cache, 64 KB L1 data cache, and a shared 4 MB L2 cache.

    The Icestorm “E cluster” has a frequency of 0.6–2.064 GHz and a maximum power consumption of 1.3 W.

    The Firestorm “P cluster” has a frequency of 0.6–3.204 GHz and a maximum power consumption of 13.8 W.

    The M1 comes with 8 GB or 16 GB of unified RAM. The RAM and the SoC sit together in a single package.

    The M1 has an integrated Apple-designed 7-core/8-core GPU.

    And that’s the spec sheet.

    This chip is faster than any Intel and AMD chip of its class.
    In fact, this chip is in its own class with its 5 nm process.

    This chip is also faster than any NVIDIA and AMD dedicated graphics card in one special way: performance per watt.

    The M1 can consume a maximum of 14W.
    In fact, the M1 MacBook Air doesn’t even come with a fan! My crappy Lenovo S20-30 netbook also came without a fan, but it had a crappy, slow piece of Intel garbage inside.

    The battery life on M1-powered laptops is outstanding.

    The M1 MacBook Air was reportedly launched at the price of $999.
    At this price point, I’d say this was an acceptable deal, BUT the MacBooks aren’t great.
    Don’t get me wrong – the M1 chip is the greatest, so far. If I had the money, I’d buy a laptop with an M1 chip.
    But Apple MacBooks aren’t a great purchase for consumers, and for several reasons, one being very poor maintenance, repair, and upgrade options.

    The M1 is a trendsetter. It’s a spaceship in a world of supercars.

    I’m excited to see what the competition will now bring to the table.
    It’ll take AMD perhaps another 5 years to get their equivalent of an M1 chip.
    It’ll take Intel 10 years or more. Because Intel sucks and FUCK INTEL!

    Everyone was talkin’ about how cool AMD chips are. I mean, look at this! It’s more powerful than anything AMD came up with, and it doesn’t even need a fan! If AMD is cool, then Apple’s M1 is ice-cold. And Intel – oh bugger! Intel may shut down by the end of this decade. Their CPUs are way overpriced, and Intel HD Graphics is completely useless when the M1 is taken into consideration. And the Iris crap isn’t any better. No praise for Intel and their new generations of the same overpriced bullshit every year!

    I mean, look at the performance vs power consumption here, both on the CPU and the integrated GPU.
    The Zen 3 products are the last great thing from AMD for the next few years. Because now they’ll have to go back to the drawing board and design an entirely new SoC that can compete with Apple’s M1, plus compatible graphics to go along with it.
    Because that’s where the money is.

    The M1 chip just falls short of AMD and Nvidia’s most powerful dedicated GPUs that were released late last year. That’s a big deal. Apple singlehandedly designed a chip with an integrated GPU that beats every dedicated GPU released before its launch.

    Nvidia will have to go back to the drawing board as well. Every company that wants to survive will have to do something! And if they don’t, rest assured that Apple will gain a whole lotta new customers. Well, that can’t be good for AMD, Intel, or Nvidia, right?

    A fanless SoC that runs like a beast! The entire motherboard is so small, it makes desktop computers look like dinosaurs.

    Now, I don’t like Apple’s tiny motherboards and terrible upgrade designs. I hope the other companies will make better choices.
    Imagine a 16-core M3 chip, with a fan and the same chassis as today’s average laptops. The thing would be a beast with the best cooling possible.
    And I hope there would be upgrade options, with RAM slots in addition to the integrated RAM. That’d be nice.

    I think this new decade will bring along some of the most powerful chips we could ever have imagined. And the longer those are delayed, the larger Apple’s customer base will grow.

    Apple’s 8-core M1 chip is already here.
    Now we just have to wait and see when they’ll release a 12-core M2 and a 16-core M3.
    By the time those come out, if AMD and Nvidia haven’t released something as good, then dedicated graphics cards will become obsolete.
    I mean, if there had been an option for the M1 chip in a Lenovo, Asus, or Dell laptop when the M1 MacBooks came out, I’d have bought one.

     