I know i just came up with another explanation for it, but consistency is the hallmark of etc. Anyway, i was reading one of those things from hckrnews and this quote:
Software development obeys the laws of entropy, like any other process. Continuous change leads to software rot, which erodes the conceptual integrity of the original design. Software rot is unavoidable, but programmers who fail to take conceptual integrity into consideration create software that rots so fast that it becomes worthless before it is even completed. Entropic failure of conceptual integrity is probably the most common reason for software project failure. (The second most common reason is delivering something other than what the customer wanted.) Software rot slows down progress exponentially, so many projects face exploding timelines and budgets before they are mercifully killed.
sparked the conjecture:
Worse is better protects from bit-rot/entropy.
As in, it avoids the uncool completeness code: code that must be there but that no one would ever really trigger or read. Worse-is-better tends to keep each piece of code small and hardy.
But maybe not.
A guy from the USAF, talking about Windows (via OSnews):
They’re still in the model that they want to give all the features enabled to clients, but I think we’ve reached a point where that model is one that is no longer effective. I’m of the opinion that all products ought to be configured with these locked-down configurations, and if the customer decides they want to undo them, then they can do that. They cannot continue fielding products where the cost that is being borne by the consumer in terms of having to maintain configurations and deal with attacks is so high.
This underlines a very important issue, for which i still don’t know which words to apply, so let’s begin with that: the act of configuring a computer has a cost. Read More »
Sam Hughes of qntm.org, after some frustration with some transhumanists, hypothesized about how creating better-than-human AI to create better-than-better-than-human AI and so on is impossible. A big discussion ensued, and he ended up closing the post seemingly conceding defeat. I do not agree with his arguments, but i agree wholeheartedly with his conclusions. If that does not prove i am not qualified to talk about intelligence, then what does? ;-)
Nevertheless. AI. The subject seems pretty hot ATM. What i think.
Basically, i hold “explosive AI” to be possible, in a theoretical kind of way, but for it to actually happen in our civilization’s time frame would call for tech breakthroughs on par with warp drive or time travel. To put it another way: common-AI is already trivial in our present, but this technology can’t progress into the general 42 that transhumanists expect. Read More »
Whenever someone begins to talk about «Usability», you can count the few seconds until they come up with «simple». For some not-so-strange reason, people think that a «good interface» equals a simple one. Mark Zuckerberg, tyrant-dictator-for-life of Facebook, responded to criticism about FB’s terrible management of privacy with “simpler controls”.
I don’t want to comment on the FB privacy issues (for that, you should as always consult danah boyd, maybe here and here). My pet-peeve right now is the idea of «SIMPLE».
And then i was reading Lost finale reviews and my whole point just vanished from my mind, so i’ll just give you the bullets, and you can read a sidetracked, maybe-promising-but-how-to-continue? explanation after the “read more”.
- A simple interface means you take processing away from it: therefore, you degrade the user’s input instead of extracting from it.
- The illusion of simplicity comes from orthogonality — but creating orthogonality is hard, and it can only be achieved by adding methods, never by sheer cutting.
- To strive for simplicity is to chase marketing slogans. A consistent design comes from grasping the full picture — even if you do it instinctively.
Read More »