17 Jan, 2009, elanthis wrote in the 61st comment:
Votes: 0
Samson said:
1GB taken up to idle at your desktop is insane.


Yes, it is. Good thing that no OS actually does that.

I mean, Linux will use up all 4GB in my desktop, sure. But 3.5 GB of that is file caches and buffers, which are evicted if an actual application needs more memory. The OS (Linux, Windows, OS X, etc.) can use it to speed stuff up. Most memory reporting tools do a horrific job of distinguishing actual memory "pressure" from total "used" memory, so be wary of claims that Windows or whatever eats up 1GB on a fresh boot. Especially Windows, which has its superfetch/readyboost/whatever that loads file caches of commonly used executables and files so that things run faster (which most Linux distros do too now, albeit far more half-assed and incomplete… par for the course).
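
To make the "pressure" vs. "used" distinction concrete, here is a minimal sketch (Linux only, just reading /proc/meminfo; the field names are the standard ones, but treat the arithmetic as a rough approximation):

#!/usr/bin/env python3
# Sketch: how much of "used" memory is really just evictable cache.
def meminfo_kb():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key] = int(rest.split()[0])  # values are reported in kB
    return info

m = meminfo_kb()
total, free = m["MemTotal"], m["MemFree"]
evictable = m.get("Cached", 0) + m.get("Buffers", 0)
print("Total:               %7.0f MB" % (total / 1024.0))
print("'Used' (naive):      %7.0f MB" % ((total - free) / 1024.0))
print("Cache/buffers:       %7.0f MB (reclaimable)" % (evictable / 1024.0))
print("Under real pressure: %7.0f MB" % ((total - free - evictable) / 1024.0))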
17 Jan, 2009, quixadhal wrote in the 62nd comment:
Votes: 0
Just for amusement…. I'm running Windows XP, and currently using 524M out of 3G. I have swap disabled. Firefox is taking 178M with 4 tabs open, the web scanning component of Avast! anti-virus software clocks in second at 59M. The file explorer comes in at 33M, and svchost comes in at 24M below that.

svchost is the equivalent of inetd + init under Linux, which take up 1.9M + 1.7M (3.6M). The equivalent of explorer.exe would be a graphical file browser, which I don't use, so I can't compare that. On the rare occasions I have a graphical desktop in Linux, Firefox seems to be around the same size there. Of course, I also can't compare anti-virus software on Linux.
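
For anyone who wants to make the same comparison on the Linux side, here is a rough sketch that pulls per-process resident sizes out of /proc (RSS double-counts shared libraries, so treat the numbers as upper bounds rather than exact footprints):

#!/usr/bin/env python3
# Sketch: top processes by resident size, read straight from /proc.
import os

def name_and_rss(pid):
    name, rss_kb = None, None
    try:
        with open("/proc/%s/status" % pid) as f:
            for line in f:
                if line.startswith("Name:"):
                    name = line.split(None, 1)[1].strip()
                elif line.startswith("VmRSS:"):
                    rss_kb = int(line.split()[1])  # reported in kB
    except (IOError, OSError):
        pass  # process exited or isn't ours to read
    return name, rss_kb

procs = []
for pid in os.listdir("/proc"):
    if pid.isdigit():
        name, rss = name_and_rss(pid)
        if name and rss:
            procs.append((rss, name))

for rss, name in sorted(procs, reverse=True)[:10]:
    print("%8.1f MB  %s" % (rss / 1024.0, name))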
17 Jan, 2009, David Haley wrote in the 63rd comment:
Votes: 0
elanthis said:
My Linux desktop has kernel-panicked twice in the last week, Firefox crashed on me this morning, Youtube videos really aren't playing sound (but everything else is), and I've spent at least 8 hours tracking down the cause of a Fedora mkinitrd bug over the last two months. That's just the recent Linux problems I've had.

Hrm. In the ~4 years I've been running Linux as a primary system, I think I've only had two unexplainable kernel panics. I did have a fairly nasty bout with random lockups and kernel panics, but that was due to a flaky RAM module.

Youtube not working was annoying for quite a while, but the 64-bit Flash libs have (almost) entirely fixed that.
elanthis said:
There are add-ons for this. Sure, they're not built-in, but technically neither are virtual desktops built into Linux. ;)

Eh, if you consider Gnome or KDE to be an "addon", then sure… :wink: I think they count as part of the "Linux Desktop" for all intents and purposes.

quixadhal said:
Now, imagine how incredibly fast our modern hardware COULD be, if it weren't shackled with megabytes upon megabytes of layered OS libraries that wrap and encapsulate everything in so many ways that you couldn't even LIFT the API manual if a complete one existed (which it doesn't).

Or, from another perspective, it is these very layers of abstraction and encapsulation that have allowed the proliferation of software (good and bad, of course).

There's a big difference between true bloat (getting away with inefficiencies simply because you have the hardware to do so) and useful abstraction; your words above make it seem like you're against all forms of abstraction in the first place.

quixadhal said:
On my C64, I didn't feel limited by the sound or graphics of the hardware… I felt that it was my own (lack of) skill that prevented me from getting something to look or sound the way I wanted it to. Now, I feel that my skill doesn't matter, because I'm expected to learn Visual Studio, and expected to program for the DirectX API, and expected to adapt myself to writing software that will "work with" everything else, even if it doesn't do what *I* want.

This is kind of like bemoaning the complexity of jet planes because you need to be a fairly qualified engineer to really understand what's going on. Sure, it's nice to be able to grasp the whole system, but if we adopted this as a general principle of life we wouldn't have gotten very far into the modern world.
17 Jan, 2009, Fizban wrote in the 64th comment:
Votes: 0
Samson said:
Fizban said:
Samson said:
Opera could if its interface weren't such a nightmare of usability.


You still haven't used it in years, have you? I really haven't found it to suffer from any usability issues in the last 5-6 releases.


Actually I just used it the other night to grab a couple of torrent files. The 9.6 version, even. Nothing has changed. It's still a nightmare of usability.


Then I suppose it's just a difference of opinion. I honestly find it to be quite user friendly: far more so than Internet Explorer, slightly more so than Chrome, and possibly slightly less so than FF.
17 Jan, 2009, quixadhal wrote in the 65th comment:
Votes: 0
DavidHaley said:
There's a big difference between true bloat (getting away with inefficiencies simply because you have the hardware to do so) and useful abstraction; your words above make it seem like you're against all forms of abstraction in the first place.


Nah, not all of them… but because the human race can't seem to accept that there IS no single tool or methodology that's the best for all purposes, we keep forcing the most popular ones down everyone's throats until something else becomes more popular. As a result, almost all the programming being done now is forcibly split into as many layers of abstraction as it can be, instead of as many as are really useful.

It's rather like the management chains of middling-sized companies during the dot-bomb era. Literally, you had a CEO with 5 vice presidents, and each VP had a director, that director had a manager, that manager had another manager, and the bottom manager had 2 or 3 people working for him. I'm not kidding! That's 8 people doing the work of 3 or 4. That's how I see abstraction in today's environment. Too many layers, not enough work getting done.

DavidHaley said:
This is kind of like bemoaning the complexity of jet planes because you need to be a fairly qualified engineer to really understand what's going on. Sure, it's nice to be able to grasp the whole system, but if we adopted this as a general principle of life we wouldn't have gotten very far into the modern world.


There's a question of just how good or bad that fact may be. :)

However, setting that aside… we've replaced one type of reinventing the wheel with another. Instead of having everyone rewriting code that does almost the same thing, and wasting man-hours doing it, we instead have people spending man-hours learning many layers of APIs to avoid wasting that same time writing code.

On the plus side, less frequent rewrites mean fewer bugs. On the minus side, when a bug does get into an API layer, nobody fixes it, because they're spending their time working around it or switching to another layer.
17 Jan, 2009, David Haley wrote in the 66th comment:
Votes: 0
quixadhal said:
However, setting that aside… we've replaced one type of reinventing the wheel with another. Instead of having everyone rewriting code that does almost the same thing, and wasting man-hours doing it, we instead have people spending man-hours learning many layers of APIs to avoid wasting that same time writing code.

But the time it takes to learn an API is significantly smaller than the time it takes to develop the expertise to write the API yourself. How many people here know how to write a TCP/IP stack? Compare with how many know how to send basic stuff around with sockets. And there you have it…
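
To put a rough number on that asymmetry, here is the entire "send basic stuff around with sockets" side of it (a sketch; example.com stands in for whatever host you actually care about). The TCP/IP stack hiding underneath those three calls (retransmission, windowing, congestion control, checksums) is tens of thousands of lines of kernel code at the very least.

#!/usr/bin/env python3
# Sketch: talking TCP through the OS's stack instead of writing one.
import socket

with socket.create_connection(("example.com", 80), timeout=5) as sock:
    sock.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    print(sock.recv(4096).decode(errors="replace"))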

Similarly, it is a lot easier to learn to use OpenGL or DirectX than it is to learn how to write device drivers for all the stuff you need to talk to, and then write your own API on top of that.

Basically, the APIs are providing the abstractions that you would otherwise be writing (or should be writing) yourself. Yes, it doesn't always mesh perfectly, but that is the price you pay. It is a very small price compared to the gains, no?

I have to admit that I'm a little surprised this is even debatable in the first place. Over time as a society we have developed specializations that let us go further as a group, because we can rely on knowing that other people will take care of their little bit of specialization. The computer engineer doesn't need to know how to build a jet engine, and vice versa, meaning that each can go further into their own specialization. Programming is no different, really. I don't want to have to worry about how to talk to the hardware: that has nothing to do with my job. Let the systems engineers take care of that, because it's what they're good at. Making everybody rewrite code all the time is far more expensive than learning an API.

After all, that's why good software engineering is modular: the whole point of it all is to avoid rewriting things all the time. When dealing with very large systems, it is far cheaper to poke at an interface that abstracts away nitty-gritty details than it is to reimplement that interface yourself. Again, any sufficiently large project with half-competent developers would end up writing a generic interface anyway, so skipping an external API doesn't actually save you the work of writing one.
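
A toy sketch of what "poking at an interface" looks like in practice; the Storage/MemoryStorage/DiskStorage names are invented for illustration, not taken from any real codebase. The calling code never changes when the nitty-gritty behind the interface does:

from abc import ABC, abstractmethod
import os

class Storage(ABC):
    # Callers program against this; they never see which backend is in use.
    @abstractmethod
    def put(self, key, value): ...
    @abstractmethod
    def get(self, key): ...

class MemoryStorage(Storage):
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

class DiskStorage(Storage):
    def __init__(self, directory):
        self._dir = directory
        os.makedirs(directory, exist_ok=True)
    def put(self, key, value):
        with open(os.path.join(self._dir, key), "wb") as f:
            f.write(value)
    def get(self, key):
        with open(os.path.join(self._dir, key), "rb") as f:
            return f.read()

def save_player(store, name, data):
    # Swapping MemoryStorage for DiskStorage requires no change here.
    store.put(name, data)

save_player(MemoryStorage(), "quixadhal", b"hp 100")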
17 Jan, 2009, Tyche wrote in the 67th comment:
Votes: 0
quixadhal said:
svchost is the equivalent of inetd + init under Linux, which take up 1.9M + 1.7M (3.6M).


svchost is a wrapper to run a DLL as an EXE. If you write a DLL whose initialization is a loop, svchost will run in about 25K.
Most of the services are DLLs, and svchost is used to run them. I would venture that the 40M+ svchost running is the Desktop Manager.

Edit: 25M in your case, XP?
17 Jan, 2009, elanthis wrote in the 68th comment:
Votes: 0
DavidHaley said:
Basically, the APIs are providing the abstractions that you would otherwise be writing (or should be writing) yourself. Yes, it doesn't always mesh perfectly, but that is the price you pay. It is a very small price compared to the gains, no?


It really depends. Yes, abstractions can be good. Sometimes you can over-abstract, or start writing APIs that solve problems that people aren't having. Linux lately has had a LOT of these kinds of things… look at the udev crap that is shoved down our throats now. It's a daemon and a configuration DSL for programmatically registering device nodes in a user-configurable way.

What does that mean? That means that you have to learn a whole ****ing custom mini-language (and related bits) to fix a device node bug, all because someone thought that someone might want to have /dev/sda named /dev/scsi/disc-a (or any other arbitrary name) someday… even though _nobody does this_ because it breaks existing applications and invalidates most sysadmins' experience and knowledge. In fact, there's even an effort now to make all the Linux distros use a standardized set of udev rules instead of shipping their own (mostly compatible) home-grown ones, in order to avoid the bits where the rules differ from the "standard"!
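
(For reference, the rule to get that /dev/scsi/disc-a name would look roughly like the line below, dropped into /etc/udev/rules.d/; the exact match keys vary from setup to setup, so treat the syntax as approximate.)

SUBSYSTEM=="block", KERNEL=="sda", SYMLINK+="scsi/disc-a"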

A whole DSL, API, daemon, and tool chain to solve a problem that nobody is having, which in the end creates a problem we weren't having before and makes simple fixes or additions far, far more complex than they ever were. Sure, the daemon was necessary to solve a real problem; but the people working on that project took it too far without really thinking through what they were doing or why they were doing it.

That isn't an isolated case. Linux, Windows, OS X… all OSes have plenty of examples of stuff like that. So do most applications, for that matter.

Abstraction for abstraction's sake is wrong. That is what differentiates good engineers (who abstract only where necessary) from bad engineers (who abstract everything because their CS professor taught them to modularize all code behind a plethora of cookie-cutter design patterns) and non-engineers (who don't know what "abstraction" means). :)
18 Jan, 2009, Zeno wrote in the 69th comment:
Votes: 0
Testing Blackbox: http://zeno.biyg.org/~zeno/Problem_20090...
View in IE
20 Jan, 2009, David Haley wrote in the 70th comment:
Votes: 0
elanthis said:
It really depends. Yes, abstractions can be good. Sometimes you can over-abstract, or start writing APIs that solve problems that people aren't having. Linux lately has had a LOT of these kinds of things… look at the udev crap that is shoved down our throats now

I think that your argument resembles the opposite one that "somebody" was making w.r.t. Linus's dislike of C++: just because some people do stupid things with something doesn't mean that that something is itself stupid. :wink:

Obviously some abstractions are annoying/stupid/useless/wasteful/etc. – I was however responding to the (apparent) claim that these layers serve no useful purpose in general.
20 Jan, 2009, Zeno wrote in the 71st comment:
Votes: 0
20 Jan, 2009, quixadhal wrote in the 72nd comment:
Votes: 0
I don't think anyone is claiming that abstraction, in general, is worthless… however, I do claim that it's over-hyped and thus over-used. Students today are taught to break everything down into class hierarchies, even when the problem is more easily solved without them, and that training carries through into the workplace.

If you're sitting with a problem and can't find a clean way to break it up into objects/classes/etc… maybe breaking it up isn't the right solution. Yet I've heard horror stories of teachers marking off for people who choose the simpler path, because they *could* have made it more object oriented.

All I know is, when Windows 95 came out, creating a directory on the desktop took something like 20 seconds, because of all the DLLs that had to be loaded up to implement all the layers between the GUI and the actual filesystem. The very same FAT32 filesystem, at the DOS prompt, could do the same task in a fraction of a second. Sure, actually creating the directory didn't take any longer, but it was a very bad design to split such a basic function off and route it through so many layers.

That bad code is still there, but hardware has gotten so much faster, and other optimizations (pre-loading the DLLs, for example) have obscured it, so nobody notices the waste. That's why I grumble about abstraction: it encourages people to be sloppy and lazy, because they CAN just use other people's black boxes, and they know nobody can see into their own.

Used properly, it's a godsend… but looking around, I think it's used to avoid work far more often than to make things work better.
20 Jan, 2009, David Haley wrote in the 73rd comment:
Votes: 0
I think it's really a stretch to blame the general notion of abstraction for all the evils of bad programmers. I would argue that in fact abstraction is difficult to get right, which is why many people who are not the best software engineers get it wrong. So, in some sense, there are many examples of poor abstractions, because there are many examples of poor programmers. It's not as if there was some kind of "Golden Age" of programming in the past where everybody wrote perfect code. There are lots of examples of crappy code from before the days of "abstraction" – it's just that today there are a lot more people writing code than back then.

I can't help but think that abstraction is being blamed for things that aren't its fault. Blame bad programmers for their bad code: don't blame something just because people do dumb things with it. (That's kind of like blaming the kitchen knife because people can cut their fingers with it…)

quixadhal said:
Yet I've heard horror stories of teachers marking off for people who choose the simpler path, because they *could* have made it more object oriented.

This kind of "horror story" needs context. If I gave a student a very specific exercise, and they did something else, then yes, they would get a bad grade because they didn't do what they were asked to do. If the question is to "solve something in the simplest way possible", then you would be correct that there's a problem here. But if the task is to do something in an object oriented fashion, then a student who does something else deserves to lose every point taken off.

Besides, for people just learning to write code, the "simplest solution" is often the least reusable, the hardest to maintain, the most bug-prone, and so forth. I would be surprised if new students were producing code so elegant in its simplicity that there would be nothing bad to say about it. :wink:
20 Jan, 2009, elanthis wrote in the 74th comment:
Votes: 0
DavidHaley said:
elanthis said:
It really depends. Yes, abstractions can be good. Sometimes you can over-abstract, or start writing APIs that solve problems that people aren't having. Linux lately has had a LOT of these kinds of things… look at the udev crap that is shoved down our throats now

I think that your argument resembles the opposite one that "somebody" was making w.r.t. Linus's dislike of C++: just because some people do stupid things with something doesn't mean that that something is itself stupid. :wink:


What? I said that abstracting can be good, but doing it for no reason is bad. How is that anything like saying that abstractions are always bad??
20 Jan, 2009, Chris Bailey wrote in the 75th comment:
Votes: 0
I think the three of you are basically saying the same thing in different ways in regards to abstraction. I could of course be misunderstanding, but that's my take on it. Now get back to arguing about something else, and quick, I'm learning here. =)
20 Jan, 2009, David Haley wrote in the 76th comment:
Votes: 0
Sorry Elanthis, I guess I was replying to what you said in the context of what Quix was saying (as I read it, at least). Since you were replying to my reply to his post, it sounded to me like you were defending the position that abstraction today is generally a bad thing.

I think Chris may be right: I think we are all by now vehemently agreeing that used properly abstraction is a good thing, and used improperly, it's not. (Of course, that is such a tautological statement that I thought people were making bigger claims than just that. :tongue:)
20 Jan, 2009, quixadhal wrote in the 77th comment:
Votes: 0
Chris Bailey said:
I'm learning here. =)


HEY! HEY!!! We'll have none of THAT around here Mister!