
Do we have way more computer power than we need?

MildredHubble

Well-Known Member
V.I.P Member
I've been messing around with some quite old iMacs (probably considered ancient at 12-14 years old), and I've noticed that they just don't feel old or cumbersome to use.

I upgraded the i3 CPU in my 27" iMac to an i5 and it's definitely a bit quicker, enough that you notice. I was happy enough with the i3, but it was a bit slower than my MacBook at converting video files. They were nothing fancy, no 4K as I don't care about it much, just 720p and 1080p. It can blitz through an episode of The Simpsons in about 10-15 minutes. Well, I consider that "blitzing" lol!
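
(For the curious, the conversions were nothing more exotic than this; a rough Python sketch shelling out to ffmpeg, with made-up file names and settings:)

    # Re-encode an episode to 720p H.264; file names are placeholders.
    import subprocess

    subprocess.run(
        [
            "ffmpeg",
            "-i", "episode.mkv",     # input file
            "-vf", "scale=-2:720",   # scale to 720p, keep aspect ratio
            "-c:v", "libx264",       # H.264 video encoder
            "-preset", "medium",     # speed vs. file-size trade-off
            "-crf", "22",            # constant-quality factor
            "-c:a", "aac",           # re-encode audio to AAC
            "episode_720p.mp4",
        ],
        check=True,
    )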

My 21" 2012 iMac is slightly newer with a 4th Generation i5 and that is quicker still. It's running the second latest MacOS and should run the latest when I feel like trying it out.

The thing is, all of my old Intel Macs run great. They do everything I need them to, and I doubt I'm really pushing them. They run newer MacOS versions than they should be able to with the help of OpenCore Legacy Patcher. I doubt most people would notice the difference between my Macs and much newer ones!

So, seeing these things run so well, I wondered if anyone else thinks we are barely using the processing power available to us, particularly in cutting-edge machines? Do you think we will ever have a use case for all those clock cycles and processor cores? People also seem to have huge amounts of memory installed; is it really necessary? Should we slow things down a bit and really squeeze every last bit of power out of these machines before we spend huge amounts of money on machines that could run half of CERN and still have enough power left over to surf the internet and watch cat videos on YouTube in 8K? :)

I have one of my favourite machines on my desk, the basic design of which came out almost exactly 40 years ago! I use it to keep notes or to just mess around with demos and occasionally games. It takes less than 7 seconds to boot into a fairly modern-looking OS (SymbOS), where I can go through and edit my lists, and then a split second later I can shut it down and switch it off. No hanging around while it shuts down. I can still get things done with that machine :)
 
In practice, the increased hardware specs seem to function mainly to accommodate new software bloat. It allows poorer-quality software to be written, but that also means a lot of new software can be written, by allowing more people to be programmers.
 
So, seeing these things run so well, I wondered if anyone else thinks we are barely using the processing power available to us, particularly in cutting-edge machines?
We use it all right, but by wasting it. The Windows desktop itself sucks up more resources than many video games; people notice that real quick when they start playing around on Linux. Especially so if they're dual-booting both systems on the same machine: games play so much better because they don't have to compete with an overbloated desktop for resources.

I saw a demonstration competition one day between a 1996 computer running Win95 and a 2015 computer running Win7. They had to boot the computer, open Office and create a document, then send that document to the printer. The Win95 machine won it by several minutes.

The Mac desktop is pretty resource hungry too.
 
For years I lusted after a computer powerful enough for CAD. Now I have far more capacity than I even dreamed of then, but most of it, and most of my 'net traffic, goes to supporting advertising and keeping it from slowing the desired content down too badly. Overall, I think it would be cheaper, and far better for our attention spans, if we paid for content either as part of a data plan or by micropayments.
The things I usually use don't need anything faster than 30 year old hardware. The software writers have gone mad for features I don't want to even know about, while often missing the ones I'd have written, and completely careless about use of resources. Half the time, a Tandy 200 would be plenty.
 
I saw a demonstration competition one day between a 1996 computer running Win95 and a 2015 computer running Win7. They had to boot the computer, open Office and create a document, then send that document to the printer. The Win95 machine won it by several minutes.
I sometimes wonder: if they had just spent the last 30 years refining Windows 95, would things be better? I suppose people would argue that's exactly what Windows 10 and 11 were. Maybe it would have been worse, but I suspect there are loads of data-mining services taking up resources, plus silly things that are mostly cosmetic that nobody notices.

I do think that the MacOS has a few things just taking up resources for no good reason. Things like flashy blurs behind foreground windows just seem a bit unnecessary, particularly when it's doing them in real time. I guarantee you there's a really quick and simple way to get much the same effect. I have always liked the "Genie Effect" when minimising windows, but that was possible on a PowerMac G3! Probably G3 iMacs too. It barely used any resources either!
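
To illustrate what I mean by "quick and simple", here's a rough Python sketch of the classic cheap trick: downscale the image, then scale it back up with smoothing. This is just my guess at a budget approach, not how MacOS actually implements its blur, and the file names are made up:

    # Cheap approximate blur: shrink the image, then blow it back up.
    # A toy sketch, not how MacOS actually does it.
    from PIL import Image

    def cheap_blur(img: Image.Image, factor: int = 8) -> Image.Image:
        """Blur by resampling down and back up, which is far cheaper
        than running a large Gaussian kernel over every pixel."""
        small = img.resize((max(1, img.width // factor),
                            max(1, img.height // factor)), Image.BILINEAR)
        return small.resize((img.width, img.height), Image.BILINEAR)

    # Hypothetical usage with a made-up file name:
    frame = Image.open("screenshot.png")
    cheap_blur(frame).save("blurred.png")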

The things I usually use don't need anything faster than 30 year old hardware. The software writers have gone mad for features I don't want to even know about, while often missing the ones I'd have written, and completely careless about use of resources. Half the time, a Tandy 200 would be plenty.

I find this is basically true of a lot of things I do with computers. The most intensive thing I've done is use music production software, but I've rarely pushed it to the point where it stutters. The last time that happened I was using a machine with a Core2Duo CPU, and that was an unusually complex project! In almost every case, the performance meter is blank, or at worst shows maybe 5% CPU usage! I haven't bothered buying the new version of the software, as I can't imagine I wouldn't be able to easily achieve my aims with the resources I have.

I see a lot of people buying expensive "plugins" for their music software and I've never needed to because I've learned how to combine the things built in to produce the exact same effect :)
 
I like to think I have just what I need, no more and no less. I'm using an Intel i5 12400F CPU, with moderate multi-threading capability: sufficient for minor video production and 3D rendering, while more than capable of playing a number of games intended to run at 1080p.

It's a CPU that defaults to 65 watts but is capable of going up to 117 watts in turbo mode, so it isn't perpetually cranking out unnecessary heat. Most important when building a new computer inside an old case...one that has to handle such thermals.

It's intended for what I normally do besides surfing the web: creating and altering digital images using Photoshop.
 
It's intended for what I normally do besides surfing the web: creating and altering digital images using Photoshop.
Some of those Photoshop effects can use a lot of CPU power, well, at least if you prefer not to wait around all day for a CPU like the ones I have in my machines. Having said that, it's one thing to be good with Photoshop and really make use of it, and entirely another to think you need a powerful computer just to hit the "Auto Enhance" button in the "Photos" app.

So far I have not really done anything that would even mildly tax a 486 when editing photos on my computer. Maybe that will change in the future.
 
Some of those Photoshop effects can use a lot of CPU power, well, at least if you prefer not to wait around all day for a CPU like the ones I have in my machines.

Oh how I know. ;)

On my legacy computer I had the same version of Photoshop, running on Windows XP with 1.5GB of memory. Some 23 years ago most of the web work I did was at 72 DPI. These days most of it is done at 300 DPI. Once in a while I'll still use my legacy system, but making and waiting for graphics to be rendered at that size is not fun.
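
The arithmetic behind that is brutal, because pixel count grows with the square of the DPI. A quick back-of-the-envelope sketch in Python (the 8x10 inch page size is just an example):

    # Pixels scale with the *square* of the DPI, so 72 -> 300 DPI is
    # roughly a 17x jump in data to render and filter.
    def pixels(width_in: float, height_in: float, dpi: int) -> int:
        return round(width_in * dpi) * round(height_in * dpi)

    web = pixels(8, 10, 72)        # 576 x 720   =   414,720 pixels
    print_ = pixels(8, 10, 300)    # 2400 x 3000 = 7,200,000 pixels
    print(print_ / web)            # ~17.4x the work, give or take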

Takes an eternity compared even to my old i5 3570K system with 16GB of memory.
 
What a loaded and multi-faceted question. It's highly dependent on the application in question. Then, you get into how engineering waste and sloppiness "keeps pace" with the technology, so you end up paying for more and more computing power just so that the corporate dev team can squander it with increasingly and totally careless engineering practices.

There's also the issue of how we accept sales pitches to technologize even things that don't need to be, and worse, some things that probably suffer for being computerized, like for reliability, or privacy.
 
What a loaded and multi-faceted question. It's highly dependent on the application in question. Then, you get into how engineering waste and sloppiness "keeps pace" with the technology, so you end up paying for more and more computing power just so that the corporate dev team can squander it with increasingly and totally careless engineering practices.

There's also the issue of how we accept sales pitches to technologize even things that don't need to be, and worse, some things that probably suffer for being computerized, like for reliability, or privacy.
Yeah :) It was deliberate, to hopefully encourage people to think about what they prioritise on their computers and consider whether they're a bit overspecced, while still being perhaps a bit "old-fashioned".

I've been wondering about this subject for around a decade. There's a video about basically this subject by the8bitGuy on YouTube. I've felt for a long time that the basic administrative tasks, e.g. navigating the file system, opening applications, browsing emails, haven't taxed the CPU in a very long time.

I think for more basic tasks we really haven't needed more than a Core2Duo, but as you say, eventually the resources are squandered by poor coding and corporate bottom lines.

About 10 years ago at work, a colleague and I were talking about computers and he asked what CPU I had. I can't remember which I had, but it was either a 3.4GHz Xeon or a 3GHz Core2Quad. He commented that it was still a decent CPU, to which I replied "Does everything I need it to..." before the usual smug twerp I worked with butted in and sneered sarcastically "Hummmm, most things...." I don't know how he became the authority on how useful I find my own computer, but that was that lol!

Thing is, I was still using that same machine until 2-3 years ago, and I only stopped using it because installing/hacking the MacOS had become a bit of a chore. It wasn't impossible, but it was more effort than I was prepared to put in at the time. I used it for games, though, through Windows, and it ran the latest AAA titles without any issues, though the fan got a little noisy at times.

The irony is that now it has become trivial again to install the latest MacOS on Core2-based systems, so I could probably use the system again if I want to. It's just sitting on a shelf because I can't throw potentially useful things away :)
 
I think for more basic tasks we really haven't needed more than a Core2Duo, but as you say, eventually the resources are squandered by poor coding and corporate bottom lines.
If you want to play the latest games, then you will want a screaming hotrod of a computer, and they are out there, and they are expensive. They come loaded with a GPU which is designed to do specific kinds of mathematical operations in great bulk at tremendous speed.

Now, if you want to use your computer as an Internet terminal, you're right. You can look at cell phone hardware to recognize that if you could plug in a keyboard and monitor, then that cell phone is plenty for that purpose. Given your interest in simplicity and minimalism, I suggest you look into a Linux distribution like Ubuntu. Linux is actually what Android is based on, and it's very, very efficient and minimalist. The usual complaint is the learning curve, but a desktop-style distribution like Ubuntu is targeted at more casual users as a replacement for Windows (or MacOS, etc.).
 
Running open source AI models, so happy in the knowledge that I'm squeezing every drop of juice from my compy.
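
For anyone curious, getting started with that can be as small as this; a minimal sketch using the Hugging Face transformers library, where the model and prompt are just examples:

    # Run a small open model locally and let it chew some spare cycles.
    # gpt2 is just an example; swap in whatever fits your hardware.
    from transformers import pipeline

    generate = pipeline("text-generation", model="gpt2")
    result = generate("Old computers are", max_new_tokens=30)
    print(result[0]["generated_text"])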
 
Definitely depends on what you're doing with it.

Like, I've got an Intel i9-12900k and an RTX 3090 and whatnot, but almost none of that is necessary for all the gaming I do... this is mostly for fractal rendering (and any other rendering; at some point I want to learn things like Blender and whatnot).

And fractals are so demanding and difficult to render that even on this machine, the process of doing it brings the machine to its knees, so to speak. Granted, this depends on which type of fractal I'm doing. The 2D ones, not so much. The 3D ones, well, freaking NASA might have some trouble with those. Particularly if I'm doing a deep render (the deeper into a fractal you are, the more draining it is to do, and sooner or later you will hit a point where you don't have enough power to go further in). Typically I try not to touch the thing whenever it's in the middle of a render, lest it explode. You can tell it's struggling when the lights on the keyboard start flipping out.
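
To give a feel for why depth costs so much, here's a toy Python sketch of the classic 2D escape-time loop (the Mandelbrot set); the 3D fractals I render are far heavier, but the principle is the same: the deeper the zoom, the bigger the iteration budget, and every pixel pays for it. The coordinates below are just example view windows:

    # Escape-time rendering cost in miniature: pixels near the boundary
    # burn the full iteration budget, and deep zooms are mostly boundary.
    def escape_time(c: complex, max_iter: int) -> int:
        z = 0j
        for n in range(max_iter):
            z = z * z + c
            if abs(z) > 2.0:
                return n      # escaped early: cheap pixel
        return max_iter       # never escaped: full budget spent

    def total_work(cx: float, cy: float, span: float,
                   size: int = 100, max_iter: int = 500) -> int:
        """Sum the iterations spent on a size x size view window."""
        total = 0
        for j in range(size):
            for i in range(size):
                c = complex(cx + (i / size - 0.5) * span,
                            cy + (j / size - 0.5) * span)
                total += escape_time(c, max_iter)
        return total

    print(total_work(-0.75, 0.0, 3.0))                        # wide view: cheap
    print(total_work(-0.7436, 0.1318, 0.001, max_iter=2000))  # deep zoom: slow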

And honestly, what I got STILL doesn't feel like enough. Some renders still take forever to do. And I don't even want to think of how long a full animation takes to make. At some point I might upgrade to a 4090 or whatever for that.

Though it's a far cry better than my previous machine. A render that took like 8 hours on that thing can be done in like 10 minutes on this one.

If you want to play the latest games, then you will want a screaming hotrod of a computer, and they are out there, and they are expensive.

You don't need a super high-end machine for gaming these days, you really don't. Even with AAA games. I know they want you to THINK you do, but you don't. Even a "low end" modern PC can handle by far the majority of games. There are very, VERY rare exceptions, but mostly it's unnecessary.

Companies get more sales by having their games be more accessible to the consumer, and if a superpowered machine is *required* for a game, it ain't gonna do so well in sales, simply because most people don't have machines like that. On top of that, they need most such games to also be playable on consoles. So they ain't gonna push the requirements too far, and for most games the PC versions have about a bazillion options for adjusting the graphics; don't have the power? Just lower the settings. That's also why optimization is so very important to developers.

Heck, my heavily damaged laptop could run pretty much anything, and it wasn't exactly powerful to begin with since it was just meant for on-the-go use (and it's old). Though I don't use that thing anymore, not since the screen got damaged.


Now granted, this all changes depending on HOW FAR you want to go. Want to play at crazy high settings on a 4K monitor? Okay, NOW you want all that extra power....

....But do you NEED it? I often find myself questioning that when it comes to things like 4K monitors. I use a simple 1080p display because I'm too lazy to upgrade to a better monitor (particularly since the monitor has no bearing whatsoever on the fractal rendering; the image size/resolution has nothing to do with the monitor), but also because I don't see the point of going further. Like, yay, let's pay 5 bazillion dollars for a nearly imperceptible resolution increase. Well, imperceptible to me anyway.
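
The raw numbers back up my laziness, at least a bit; a 4K panel pushes exactly four times the pixels of 1080p, so at the same settings it's roughly four times the GPU work per frame:

    # 4K vs. 1080p in raw pixel counts.
    uhd = 3840 * 2160    # 8,294,400 pixels
    fhd = 1920 * 1080    # 2,073,600 pixels
    print(uhd / fhd)     # 4.0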
 
Heck, my heavily damaged laptop could run pretty much anything, and it wasn't exactly powerful to begin with since it was just meant for on-the-go use (and it's old). Though I don't use that thing anymore, not since the screen got damaged.
Sounds like my Core2Quad (or possibly Xeon) running Call of Duty; it was the '80s-themed one, can't remember what it's called now, Cold War? I only played past the first mission. I couldn't deal with the violence, something I seem to have become a lot more sensitive to over the last few years. But yeah, I was getting some decent frame rates out of it. Can't remember my GPU; it's an AMD RX 400 or 500 series or something or other.

I suppose it makes sense, as COD is hyper popular, so they will optimise it to run on a potato-matic machine. Mine was managing with some of the settings turned up a good bit. I think the shadows were the thing that made it stutter.

But it's cool you are using all those clock cycles for something constructive. I'm curious actually, when the fractal has rendered, do you get to keep them and view them from any point?
 
If you want to play the latest games, then you will want a screaming hotrod of a computer,
Not true at all.

All you need to play video games is a decent graphics card. Database manipulation and video editing are a couple of things that consume a lot of processor power but I'm playing the latest and greatest video games on a Core i3.
 
I'm curious actually, when the fractal has rendered, do you get to keep them and view them from any point?

Yeah, the actual fractal object is separate from the final render, kinda like how it goes with something like Blender.

Like, I've got this final product here, that I'd done quite a while ago:

[attached image: youstupidthing.jpg]


But I still have the files, and in the editor it looks like this:

[attached image: editor.jpg]


It's a full 3D model that can be manipulated however you want or viewed from any angle.

I went and did some changes just now and...

[attached image: VioletMess6.jpg]


I'm not sure WHY I did that, but I did it.

It's technically the same object, but I upped the complexity and changed a few parameters until it did... that. Add a horrible background because I apparently don't have good ones anywhere, move the camera till I get a good angle, and there ya go.

Whole PC nearly died doing whatever that is (the final image resolution is MUCH bigger than the one I'm showing here). I think that would have been entirely beyond my previous PC altogether. This is full raytracing on an object with a metallic/reflective surface, and all that fun stuff.
 
Like, I've got this final product here, that I'd done quite a while ago:

[attached image: youstupidthing.jpg]
This one kinda reminds me of the OG Xbox Flubber in the boot sequence, but waaaay more complex and also made by the Borg from Star Trek!

Those clock cycles are going to great use, they look amazing! :)
 
