AnimalMagic wrote:
Hey nutcase,
You're an idiot.
YOU failed to hack it!
I never attempted to hack it, dipshit.
Learn to wipe your own arse.
Grow the **** up, you retarded twit.
On a sunny day (Sun, 10 Aug 2008 15:02:40 GMT) it happened Jan Panteltje wrote:
And for the others: Sony was to have two HDMI ports on the PS3,
which should have made for interesting experiments.
But the real PS3 only had one, so I decided to skip the Sony product
(most Sony products I have bought in the past were really bad, actually).
And Linux runs on anything; for less than
the cost of a PS3 you can assemble a good PC, so if you must run Linux,
why torture yourself on a PS3? Use a real computer.
But perhaps if you are one of those gamers... well,
the video modes also suck on that thing.
And the power consumption is high,
not green at all,
and it does not have that nice Nintendo remote.
What does C have to do with it, other than being a contributor to the
chaos that modern computing is? More big programming projects fail
than ever make it to market. OS's are commonly shipped with hundreds
or sometimes thousands of bugs. Serious damage has been done to
consumers, businesses, and US national security through the criminally
stupid design of Windows. Lots of people are refusing to upgrade their
apps because the newer releases are bigger, slower, and more fragile
than the older ones. In products with hardware, HDL-based logic, and
firmware, it's nearly always the firmware that's full of bugs. If
engineers can write bug-free VHDL, which they usually do, why can't
programmers write bug-free C, which they practically never do?
Things are broken, and we need a change. Since hardware works, and
software doesn't, we need more of the former with more control over
less of the latter. Fortunately, that *will* happen, and multicore is
one of the drivers.
I have stated no theories. I have observed that the number of cores
per CPU chip is increasing radically, that Moore's law has
repartitioned itself away from raw CPU complexity and speed into
multiple, relatively modest processors, and that this is happening
across the whole range: scientific, desktop, and embedded. Are you
denying that this is happening?
If not, do you have any opinions on whether having hundreds of fairly
fast CPUs, instead of one blindingly-fast one, will change OS design?
Will it change embedded app design?
If you have no opinions, and can conjecture no change, why do you get
mad at people who do, and can? Why do you post in a group that has
"design" in its name? Maybe you should start and moderate
sci.electronics.tradition.
John
Where do you hook up the keyboard and monitor to your wireless LAN
router running Linux?
I've snipped the rest of your drivel, Jan, because the above says it all.
Before you make even more of an ass of yourself, why not actually do a
Google search on 'monolithic kernel' to get some idea of how it's
actually defined by those who know what they're talking about?
Small programs written by one person can be perfect, and often are. So
why not have the OS kernel be small, and written by one person?
Why waste X-prizes on solar cars and suborbital tourism? How about...
$10 million for a firm specification of a multiple-CPU OS architecture
based on a nanokernel design.
Another $10M for public-domain code that implements that kernel.
A final $10M for a working OS using the above.
For a mere $30 million, about 1/5 of what the first release of NT
cost, we could change the world.
(I think Vista is an attempt at a smaller kernel, but it pays a big
price in overhead.)
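To make "nanokernel" concrete, here is a toy sketch of the shape being
proposed. Everything in it is invented for illustration, and it simulates
the message passing in a single process so it actually runs; the point is
only that the kernel does nothing but carry small fixed-size messages,
and a "driver" is an ordinary program that turns requests into replies.

/* Toy sketch of the nanokernel idea (all names invented): the kernel's
 * only job is moving small fixed-size messages between endpoints, and a
 * "driver" is just a loop that turns request messages into replies. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

typedef struct {
    uint32_t op;        /* operation code, defined by the service */
    uint64_t arg;       /* small inline payload                   */
    uint64_t result;    /* filled in by the service               */
} msg_t;

#define OP_READ_BLOCK 1u

/* The "disk driver": a pure message transformer, sharing no state
 * with its callers, so a crash here can't take the kernel down. */
static void disk_driver(const msg_t *req, msg_t *reply)
{
    memset(reply, 0, sizeof *reply);
    reply->op = req->op;
    if (req->op == OP_READ_BLOCK)
        reply->result = 0xC0FFEE00u + req->arg;  /* pretend block data */
}

/* The entire "kernel": deliver a request, deliver the reply. */
static void nk_call(void (*service)(const msg_t *, msg_t *),
                    const msg_t *req, msg_t *reply)
{
    service(req, reply);
}

int main(void)
{
    msg_t req = { OP_READ_BLOCK, 7, 0 }, reply;
    nk_call(disk_driver, &req, &reply);
    printf("block %llu -> %#llx\n",
           (unsigned long long)req.arg,
           (unsigned long long)reply.result);
    return 0;
}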
So you wait for others to do it for you?
Then you can 'do the little trick' like a monkey.
You must be posting from alt.monkeys.
Bye
RS232 terminal dummy.
http://panteltje.com/panteltje/wap54g/index.html#wapserver
So I did, and the thing is not so sharply bounded as it may seem.
You can have a device driver in user space in Linux too; I have done that.
I am referring to
http://upload.wikimedia.org/wikipedia/commons/d/d0/OS-structure2.svg
I agree my definition of monolithic is slightly different from yours and Wikipedia's.
Whether you know what you are talking about, I dunno; maybe you do.
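For what a user-space driver looks like in practice, a minimal sketch
using Linux's UIO framework. This assumes some device has already been
bound to UIO and shows up as /dev/uio0; the register access at the end is
invented for illustration.

/* Minimal user-space driver sketch using Linux UIO.  Assumes a device
 * is already bound to the uio driver and appears as /dev/uio0; the
 * register offset below is invented. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/uio0", O_RDWR);
    if (fd < 0) { perror("open /dev/uio0"); return 1; }

    /* Map the device's first register window into our address space. */
    volatile uint32_t *regs = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                   MAP_SHARED, fd, 0);
    if (regs == MAP_FAILED) { perror("mmap"); return 1; }

    uint32_t irq_count;
    /* read() on a UIO fd blocks until the device raises an interrupt. */
    if (read(fd, &irq_count, sizeof irq_count) == sizeof irq_count)
        printf("interrupt %u, status reg = %#x\n", irq_count, regs[0]);

    munmap((void *)regs, 4096);
    close(fd);
    return 0;
}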
John said:
than the older ones. In products with hardware, HDL-based logic, and
firmware, it's nearly always the firmware that's full of bugs. If
engineers can write bug-free VHDL, which they usually do, why can't
programmers write bug-free C, which they practically never do?
John said:
Changing "platforms" in an embedded system is such a hassle that the
code is a fraction of the effort. It's rarely done.
And C is not portable in embedded systems. Assuming it is, is begging
for bugs.
Don't do that.
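A classic instance of the portability trap, as an illustration only:
arithmetic that silently assumes the width of int, which is 16 bits on
small micros and 32 bits on bigger parts. The fixed-width types in
<stdint.h> make the assumption explicit.

#include <stdint.h>
#include <stdio.h>

/* Non-portable: on a 16-bit micro, int is 16 bits, so ms * 1000
 * overflows (undefined behavior) once ms exceeds 32.  The same line
 * is fine on a 32-bit part, so the bug hides until you change chips. */
long us_buggy(int ms)        { return ms * 1000; }

/* Portable: the widths are spelled out, and the multiply is forced
 * into 32 bits on every target. */
int32_t us_fixed(int16_t ms) { return (int32_t)ms * 1000; }

int main(void)
{
    printf("%ld %ld\n", us_buggy(66), (long)us_fixed(66));
    return 0;
}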
I've got a small software project going now to write a new material
control/BOM/parts database system. The way I've found to keep the bugs
under reasonable control is to use one programmer, one programmer
supervisor, and four testers.
John
A single database file, of fixed-length records, programmed in
PowerBasic, direct linear search to look things up. Fast as hell and
bulletproof.
We've been using my old DOS version for about 10 years, and never
corrupted or lost anything. The new windows-y version is a lot
spiffier, but has the same architecture.
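That architecture is simple enough to sketch in a few lines. The
following is only an illustration of the fixed-record, linear-scan idea
in C, not John's PowerBasic code, and the field sizes are invented. With
a few thousand parts, scanning every record takes microseconds, and with
no indexes or pointers there is nothing to corrupt.

/* Illustration of the fixed-length-record, linear-search design: one
 * flat file, every record the same size, lookup = scan from the top. */
#include <stdio.h>
#include <string.h>

typedef struct {
    char part_no[16];   /* key, NUL-padded */
    char descr[64];
    long qty_on_hand;
} rec_t;                /* record N lives at byte N * sizeof(rec_t) */

/* Scan the whole file for a part number; on a hit, fill *out and return 1. */
static int lookup(const char *path, const char *part_no, rec_t *out)
{
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    while (fread(out, sizeof *out, 1, f) == 1) {
        if (strncmp(out->part_no, part_no, sizeof out->part_no) == 0) {
            fclose(f);
            return 1;
        }
    }
    fclose(f);
    return 0;
}

int main(void)
{
    rec_t r;
    memset(&r, 0, sizeof r);
    strncpy(r.part_no, "R123", sizeof r.part_no - 1);
    strncpy(r.descr, "10k resistor, 1%", sizeof r.descr - 1);
    r.qty_on_hand = 420;

    FILE *f = fopen("parts.dat", "wb");
    if (!f) return 1;
    fwrite(&r, sizeof r, 1, f);
    fclose(f);

    rec_t hit;
    if (lookup("parts.dat", "R123", &hit))
        printf("%s: %s, qty %ld\n", hit.part_no, hit.descr, hit.qty_on_hand);
    return 0;
}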
John said:
Never? Every FPGA on the planet has bugs?
For some reason, our shipped products rarely have bugs in FPGAs or in
embedded code. But development-time bugs are far fewer in FPGAs as
compared to uP code.
One recent product has about 7K lines of VHDL and
7K lines of assembly, and I'd estimate the initial-coding bug level to
be 10:1 or maybe 20:1 in favor of the VHDL.
Most of the test instruments I've bought lately seem to have superb
hardware and flakey firmware.
Programmers tend to whip out a lot of code fast, get it to compile by
trial-and-error, then run it and look for bugs; or preferably, let
somebody else look for the bugs. Most C code is virtually unreadable,
uncommented and undocumented, or documented wrong. As a design
methodology, that's crap.
I'd fire any hardware designer who worked this way. Any programmer,
too.
John
My code works fine, because I approach it as engineering, and because
I'm not a programmer, and because I hate to program.
Read "Showstopper!" and "Dreaming in Code" to see how the real pros do
it.
As I've said, my programs work fine. I have no immediate need to
"learn to program better", any more than I need to learn to solder
better. But I am fascinated by technology disasters of all kinds, and
modern "computer science" has given us lots of juicy examples.
And studying failure is a valuable technique for avoiding same.
Sure.
I recently evaluated three general-purpose drawing programs, one
public-domain and two commercial. All three were horrors, undrivable
or buggy as sin. OpenOffice Draw looks OK, if you don't mind the 127
Mbyte download.
John