Maker Pro

Re: Intel details future Larrabee graphics chip

AnimalMagic
On a sunny day (Sun, 10 Aug 2008 15:02:40 GMT) it happened Jan Panteltje wrote:

And for the others: Sony was to have two HDMI ports on the PS3,
which should have made for interesting experiments.

But the real PS3 only had one, so I decided to skip the Sony product
(most Sony products I have bought in the past were really bad, actually).
And Linux you can run on anything (and it runs on anything),

Where do you hook up the keyboard and monitor to your wireless LAN
router running Linux?

You're a fucking retard, boy.
For less than
the cost of a PS3 you can assemble a good PC, so if you must run Linux,
why bother torturing yourself on a PS3? Use a real computer.

A PS3 is a real computer, you retarded piece of shit. It happens to
require high-end displays. It happens to play Blu-ray movie titles, but
aside from that, it is a VERY cheap PC, as it were.


So you assemble what you call a good PC for less than the cost of a
PS3, and you stick it right up your obviously obese, retarded ass.
But perhaps if you are one of those gamers... well
the video modes also suck on that thing.

No, they do not, not that a dopey **** like you could even understand
what a video array is.
And the power consumption is high,

Really. Do you even know the consumption rate of a PS3, boy?
not green at all,

As if you even knew what green is.
and it does not have that nice Nintendo remote.
:)))))))))))))))))))))))))))))))))))))))))


Jeez, now I know you are a retarded ****. The PS3 camera does the same
thing. Oh boy... spatial realism. Not that a pissy little boy like you
that claims the video modes "suck" would even know what that is about.


You are a fucking joke, asswipe.
 
Jan Panteltje
What does C have to do with it, other than being a contributor to the
chaos that modern computing is? More big programming projects fail
than ever make it to market. OSes are commonly shipped with hundreds
or sometimes thousands of bugs. Consumers, business, and US national
security have been seriously damaged by the criminally stupid design
of Windows. Lots of people are refusing to upgrade their apps because
the newer releases are bigger, slower, and more fragile than the older
ones. In products with hardware, HDL-based logic, and firmware, it's
nearly always the firmware that's full of bugs. If engineers can write
bug-free VHDL, which they usually do, why can't programmers write
bug-free C, which they practically never do?

Things are broken, and we need a change. Since hardware works, and
software doesn't, we need more of the former with more control over
less of the latter. Fortunately, that *will* happen, and multicore is
one of the drivers.



I have stated no theories. I have observed that the number of cores
per CPU chip is increasing radically, that Moore's law has
repartitioned itself away from raw CPU complexity and speed into
multiple, relatively modest processors, and that this is happening
across the range of processors: scientific, desktop, and embedded.
Are you denying that this is happening?

If not, do you have any opinions on whether having hundreds of fairly
fast CPUs, instead of one blindingly-fast one, will change OS design?
Will it change embedded app design?

If you have no opinions, and can conjecture no change, why do you get
mad at people who do, and can? Why do you post in a group that has
"design" in its name? Maybe you should start and moderate
sci.electronics.tradition.

John

Hi John,
electronics design is not (!= in C ;-) ) software design.
Just stating there will be more cores on a chip is obvious;
we have known that for years.

Stating that more cores will improve _reliability_
(in the widest sense of the word), as you seem to
(at least that is what I understand from your postings),
puts the burden of proof on you.

You call software bad, yet you claim your own small asm programs are perfect;
this makes one suspicious.

There is a lot of good software; I would say that software that does what
it is intended to do, and does that without crashing, is good software.
If that software runs on good hardware you can do a lot with it.
All the problems with MS operating systems are alien to me; the last MS OS I bought was Win98SE. I
still have it on a PC, and it does occasionally misbehave; I use it
for my Canon scanner, and sometimes for DVD layout.
I will not go online with it...
All other things run various versions / distributions of Linux; I think
I have tried most of these, and all but RatHead worked OK.

So I do not really see your problem: things do not crash,
the software I wrote myself does not crash,
things do not get infected with trojans, viruses, worms, or other things...
I have a very good firewall (iptables) and the latest DNS fixes, and this server has now
been running since 2004, still with the same Seagate hard disk...

What is your problem?
As to computer languages, the portability of C will help you out big time
once you want to run that same stable application on, say, a MIPS platform,
or any other processor.
Re-writing your code in asm for each new platform is asking for bugs,
so C is a universal solution.
Especially for more complex programs.
AND for operating systems.
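
A minimal sketch of what that portability looks like in practice (the names here are made up): stick to the fixed-width types from stdint.h instead of assuming the size of int, and the same source builds correctly on x86, MIPS, or any other processor.

#include <stddef.h>
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

/* Portable: uint32_t is exactly 32 bits everywhere, while a plain
   'int' may be 16, 32, or 64 bits depending on the platform. */
static uint32_t checksum(const uint8_t *buf, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += buf[i];
    return sum;
}

int main(void)
{
    const uint8_t data[] = { 1, 2, 3, 4 };
    /* PRIu32 prints a uint32_t correctly on every platform. */
    printf("checksum = %" PRIu32 "\n", checksum(data, sizeof data));
    return 0;
}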
 
Jan Panteltje
You're an idiot.


I never attempted to hack it, dipshit.

So you wait for others to do it for you?
Then you can 'do the little trick' like a monkey.
You must be posting from alt.monkeys.
Bye
 
Jan Panteltje
Where do you hook up the keyboard and monitor to your wireless LAN
router running Linux?

RS232 terminal, dummy.
http://panteltje.com/panteltje/wap54g/index.html#wapserver

Actually, I have the RS232 disconnected, as all is working so well.
I use telnet to access the WAP server; it is faster, and does
not interfere with normal operations.
It allows me to start and stop processes, get logfiles,
set wireless on and off, etc.
Screenshot of a simple telnet session to the WAP server from another PC:
ftp://panteltje.com/pub/wap.gif
 
Jan Panteltje
I've snipped the rest of your drivel, Jan, because the above says it all.

Before you make even more of an ass of yourself, why not actually do a
Google search on 'monolithic kernel' to get some idea of how it's
actually defined by those who know what they're talking about?

So I did, and the thing is not as sharply bounded as it may seem.
You can have a device driver in user space in Linux too; I have done that myself.
I am referring to
http://upload.wikimedia.org/wikipedia/commons/d/d0/OS-structure2.svg
I agree my definition of monolithic is slightly different from yours and Wikipedia's.
Whether you know what you are talking about, I dunno; maybe you do.
 
Jan Panteltje
Small programs written by one person can be perfect, and often are. So
why not have the OS kernel be small, and written by one person?

QNX has just made their sources public; you still need a license for commercial use though.
Are you thinking that way?
In the eighties I worked with somebody who really liked QNX; I was into Unix.
Unix in the form of Linux solves a lot of programming problems, but brought the real-time
problem of task switching breaking some MSDOS-like apps.
So in a way it required more hardware.

Why waste X-prizes on solar cars and suborbital tourism? How about...

$10 million for a firm specification of a multiple-CPU OS architecture
based on a nanokernel design.

Another $10M for public-domain code that implements that kernel.

A final $10M for a working OS using the above.

For a mere $30 million, about 1/5 of what the first release of NT
cost, we could change the world.

(I think Vista is an attempt at a smaller kernel, but it pays a big
price in overhead.)

Why spend all those millions? We _have_ Linux; it works, it is mostly
written in C, and because of that it is relatively easily portable to many platforms.
 
AnimalMagic
So you wait for others to do it for you?
Then you can 'do the little trick' like a monkey.
You must be posting from alt.monkeys.
Bye


Jeez... If you were any more of a goddamned retard, I'd swear you were
put on this planet to torture the intelligent humans.

You don't even rate sub-human, you little dung heap.
 
AnimalMagic
So I did, and the thing is not as sharply bounded as it may seem.
You can have a device driver in user space in Linux too; I have done that myself.
I am referring to
http://upload.wikimedia.org/wikipedia/commons/d/d0/OS-structure2.svg
I agree my definition of monolithic is slightly different from yours and Wikipedia's.
Whether you know what you are talking about, I dunno; maybe you do.


Bwuahahahahah! How rich! How... Obama like!

Bwuuuuuuhhahahahahahahahahahahahhahaha!
 
Kim Enkovaara
John said:
than the older ones. In products with hardware, HDL-based logic, and
firmware, it's nearly always the firmware that's full of bugs. If
engineers can write bug-free VHDL, which they usually do, why can't
programmers write bug-free C, which they practically never do?

There is no such thing as bug-free HDL. The bug density is just usually
lower, especially in ASICs. The main reason for that is more thorough
testing of the code, because respins of a chip are slow and expensive.
In FPGAs, where you can always do an update, the bug densities are much
higher in the beginning.

--Kim
 
Kim Enkovaara
John said:
Changing "platforms" in an embedded system is such a hassle that the
code is a fraction of the effort. It's rarely done.

At least in the high end of embedded systems, processor updates and
model changes are quite frequent. The lifetimes of processors
and their peripherals (especially DRAM memories) are becoming shorter
all the time. The code has to be portable and easily adaptable to
different platforms.
And C is not portable in embedded systems. Assuming it is is begging
for bugs.

C is very portable in embedded systems as far as I have seen. Some very
minimal processors have weird compilers, but the bigger processors
usually have gcc support, and the commercial compilers support C the
same way gcc does (a sketch of the one classic pitfall follows below).
Don't do that.

High-end embedded systems can easily contain 10Mloc of code, and that
amount is needed to support all the required features.
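
As promised above, the classic pitfall is byte order. A minimal sketch (the helper name is made up): assembling a multi-byte value byte-by-byte gives the same answer on big- and little-endian CPUs, where a pointer cast does not.

#include <stdint.h>

/* Portable: builds the value from individual bytes, so the result is
   the same on big- and little-endian CPUs, aligned or not. */
static uint32_t read_be32(const uint8_t *p)
{
    return ((uint32_t)p[0] << 24) |
           ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8)  |
            (uint32_t)p[3];
}

/* Non-portable alternative to avoid: *(const uint32_t *)p depends on
   the host's endianness and on p being suitably aligned. */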


--Kim
 
Jan Panteltje
I've got a small software project going now to write a new material
control/BOM/parts database system. The way I've found to keep the bugs
under reasonable control is to use one programmer, one programmer
supervisor, and four testers.

John

PostgreSQL with phpPgAdmin as frontend here, web-based.
What bugs? Yes, the bugs in my SQL :)
 
Jan Panteltje
A single database file, of fixed-length records, programmed in
PowerBasic, direct linear search to look things up. Fast as hell and
bulletproof.
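
A rough C rendering of the scheme described above (the record layout is made up, and the original is PowerBasic, not C):

#include <stdio.h>
#include <string.h>

/* One flat file of fixed-length records, looked up by linear search. */
struct part {
    char part_number[16];   /* fixed size, zero padded */
    char description[64];
    long quantity;
};

/* Returns 1 and fills *out if the part is found, 0 otherwise. */
static int find_part(const char *path, const char *number, struct part *out)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return 0;
    while (fread(out, sizeof *out, 1, f) == 1) {
        if (strncmp(out->part_number, number, sizeof out->part_number) == 0) {
            fclose(f);
            return 1;   /* found: record already copied into *out */
        }
    }
    fclose(f);
    return 0;
}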

I agree that for one specific case it is faster and simpler.
But using PostgreSQL as the database, not only is it free, it
has a huge user base, so any questions I asked
in the relevant newsgroups were immediately answered.
phpPgAdmin is cool; it practically constructs the database for you,
all you have to do is specify what fields, and types of fields, you want.
I use it for everything, and can access it over the web from everywhere,
even for my DVD collection.
You generate some 'views', the ones you often need, and one mouse click shows what you looked for.
Backup is easy too: it takes only a few minutes to back up the whole postgres database
(company administration, projects, other stuff...) just from the command line.
You only back up the data, so it is very compact.

The only programming involved is the SQL to generate the views,
or to generate a search.
Yes, yet another language to learn, but a universal one.
And total cost: zero, open source.
http://phppgadmin.sourceforge.net/
http://www.postgresql.org/
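
And you are not tied to the web frontend; a minimal sketch in C using libpq (the view name and connection string here are made up):

#include <stdio.h>
#include <libpq-fe.h>   /* PostgreSQL C client library; link with -lpq */

int main(void)
{
    PGconn *conn = PQconnectdb("dbname=mydb");
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connect failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* Query one of those saved views, e.g. for the DVD collection. */
    PGresult *res = PQexec(conn, "SELECT title, year FROM dvd_view");
    if (PQresultStatus(res) == PGRES_TUPLES_OK) {
        for (int i = 0; i < PQntuples(res); i++)
            printf("%s (%s)\n", PQgetvalue(res, i, 0),
                                PQgetvalue(res, i, 1));
    }
    PQclear(res);
    PQfinish(conn);
    return 0;
}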


One piece of advice: if you use phpPgAdmin, put it in a non-standard directory on the web server;
there have been many illegal attempts to access it in its standard directories on my machine.
If you put it in:
/usr/local/httpd/htdocs/if_you_guess_this_then_play_the_lottery_more_often/index.php
then they likely won't find it.
Now I am really curious whether the suckers will try that directory :)

We've been using my old DOS version for about 10 years, and never
corrupted or lost anything. The new windows-y version is a lot
spiffier, but has the same architecture.

DOS = Denial Of Service ???
LOL.
Even abbreviations change meaning over time.
 
Kim Enkovaara
John said:
Never? Every FPGA on the planet has bugs?

If the chip is nontrivial I would say that there is something
hiding inside the chip. Coders are not perfect and neither are
the tests. Also the tool flows have bugs; for example, I have
debugged many synthesizer bugs in the past. Asynchronous
interfaces are also a good place for finding bugs that might
show up only when the manufacturer updates the FPGA process, etc.

I have seen bugs found in chips that have been in real use for
10+ years; just some weird condition causes a bug to appear
from thin air.
For some reason, our shipped products rarely have bugs in FPGAs or in
embedded code. But development-time bugs are far fewer in FPGAs as
compared to uP code.

One explanation is that in HDL coding it is normal to do module- and
top-level testing of the code in a simulator. For software, module testing
is not normal practice, and the tools for it are quite recent.
One recent product has about 7K lines of VHDL and
7K lines of assembly, and I'd estimate the initial-coding bug level to
be 10 or maybe 20:1 in favor of the VHDL.

7K lines of VHDL is quite a small design. I usually work with 10-100x
bigger VHDL/Verilog codebases.
Most of the test instruments I've bought lately seem to have superb
hardware and flakey firmware.

On more complex platforms HW is ~10% of the effort, so it is obvious
that the bigger part of the development might also have more bugs.
Software is the critical path; the HW side usually has time to do more
testing and has more lab time to sort out bugs.


--Kim
 
Ken Hagan
Whilst not disputing that hardware folks probably come to the table
with a rather different attitude to bugs...

Given a problem to be solved by a combination of hardware and software,
the hardware is usually used *merely* (ducks for cover) to provide a
set of primitives which the software will then stick together in a
more complex way. The software would then be solving a more complex
or less well specified problem and might reasonably be expected to
have more of the bugs.

A second effect is that any deficiencies in the chosen set of primitives
are likely to lead to bugs that get blamed on software rather than the
design. Even if the correct attribution is made, software usually *is*
easier to change (and later in the development schedule), so managers
decide to "fix it in software".

A third effect is that *application* software has so many dependencies
on code written by people you have never met and who have no interest
in your welfare. If we are to compare bug rates in hard/firm/soft-ware
then we'd need to direct our gaze at things like interrupt handlers
and the lowest levels of device drivers.

A fourth effect is that customer expectations are different (and software
often *can* be changed after purchase) and so the commercially sensible
decision is to be first to market and accept a non-zero level of bugs.
 
Jan Panteltje
Programmers tend to whip out a lot of code fast, get it to compile by
trial-and-error

Maybe you should meet more programmers.

, then run it and look for bugs;

.... when I write code I do not look for bugs; I test whether it does what it has to do.
That may sometimes leave subtle things, which you would not find with a debugger either;
some clever person inspecting your OPEN SOURCE code may email you about them.
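
For what it is worth, a minimal sketch of what 'test whether it does what it has to do' can look like in C (the function under test is made up):

#include <assert.h>

/* Hypothetical function under test. */
static int clamp(int value, int lo, int hi)
{
    if (value < lo) return lo;
    if (value > hi) return hi;
    return value;
}

int main(void)
{
    /* Not hunting for bugs: checking the specified behaviour. */
    assert(clamp(5, 0, 10) == 5);    /* inside range: unchanged   */
    assert(clamp(-3, 0, 10) == 0);   /* below range: clipped low  */
    assert(clamp(42, 0, 10) == 10);  /* above range: clipped high */
    return 0;
}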


or preferably, let
somebody else look for the bugs.

If real bugs are present, things that only happen in specific circumstances,
user feedback is important.
Often problems arise when a person uses a later version of the kernel, with
a later compiler version, on a different platform...
Is that a bug? I dunno, but if it is OPEN SOURCE then it can be easily fixed.
If it is closed source, like some old Linux games I have, they no longer run...

Most C code is virtually unreadable,

As I mentioned before, Greek is very readable to the Greek.
If you do not have experience with C, then it will not be very readable to you.

uncommented and undocumented,

There is a way to write self-documenting code in C.
That starts with using sane names for functions, so instead of writing

int aaaa(int ac, char *bb, double *flup), one can write

int calculate_speed(int tacho_ticks_per_second, char *sensor_name, double *result_in_km_hour)
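
Filled out a little (a made-up sketch, with made-up constants), the well-named version documents itself:

/* Convert tacho pulses to a speed estimate. The names carry the
   units, so the code needs few extra comments. */
#define WHEEL_CIRCUMFERENCE_M      1.85
#define TICKS_PER_WHEEL_REVOLUTION 8

int calculate_speed(int tacho_ticks_per_second,
                    char *sensor_name,
                    double *result_in_km_hour)
{
    if (tacho_ticks_per_second < 0 || result_in_km_hour == 0)
        return -1;  /* bad input: refuse, do not guess */

    double revolutions_per_second =
        (double)tacho_ticks_per_second / TICKS_PER_WHEEL_REVOLUTION;
    double meters_per_second =
        revolutions_per_second * WHEEL_CIRCUMFERENCE_M;

    *result_in_km_hour = meters_per_second * 3.6;  /* m/s to km/h */
    (void)sensor_name;  /* kept for logging in a fuller version */
    return 0;
}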


or documented wrong.

Normally some extra /* comments */ in the OPEN SOURCE will help a lot in understanding the program flow.

As a design
methodology, that's crap.

Start writing some C programs, and see if you can stick to the above simple rules.

I think a lot of confusion arises because maybe some programmers think that if you write

a=b*c;

it is faster than if you use

a = b * c;
;-)
Sure, spaces take up memory, but not much, and they sometimes make the source a lot more readable.

I'd fire any hardware designer who worked this way. Any programmer,
too.

Improve the world, and start with yourself.



 
Jan Panteltje
My code works fine, because I approach it as engineering, and because
I'm not a programmer, and because I hate to program.

Read "Showstopper!" and "Dreaming in Code" to see how the real pros do
it.

When I had the TV shop, some guy told me:
Go look at the competition, see how they do it.
And my reply was, and still is: Let the competition look at me :)


So, HOW do you think you will learn to program better by reading about how it is
done wrong?
You will only learn to program better by looking at great code, and by learning from those coders.
This is why open source is so incredibly important.
Instead of every new programmer starting to make all the mistakes all over
again when writing some application, you can build and improve on previous work.

This is why Linux takes that incredible flight: thousands and thousands of individuals
contributing, like one big brain.

Now put that against those few brainwashed newbie ex-student programmers working in and for Redmond.

You know, I am grateful if somebody takes the trouble of pointing out a flaw in my code.
Sure, there is that moment of 'aaargh, what was this' (with code maybe years old, you need to look it up and regain your
understanding of the whole flow), but the code lives on; it will outlive you,
it will teach new people, and they will further improve it.

What a difference with the sad mouse-clickers in Redmond, who will be obsolete with the next
feature-limited version of their OS.

 
Jan Panteltje
As I've said, my programs work fine. I have no immediate need to
"learn to program better", any more than I need to learn to solder
better. But I am fascinated by technology disasters of all kinds, and
modern "computer science" has given us lots of juicy examples.

And studying failure is a valuable technique for avoiding same.
Sure.


I recently evaluated three general-purpose drawing programs, one
public-domain and two commercial. All three were horrors, undrivable
or buggy as sin. Open Office Draw looks OK, if you don't mind the 127
Mbyte download.

John

Be careful; one thing that sort of did hit me was this posting:
http://groups.google.com/group/rec.video.production/browse_thread/
thread/697ce9df60d62731/8f540eb5adcef401?lnk=gst&q=wits#8f540eb5adcef401

(I have folded the URL as the server complained last time it was too long.)

The poor guy had a defective memory module in his PC; he found all programs sucked,
all his work for nothing. Anyway, after somebody pointed him to a memory test
program he finally replaced the defective memory, and I then advised him
to re-install his stuff, which he did not immediately do... anyway,
the problems later disappeared.

Many modern programs (like drawing programs; you are not specific, so what
do I know) have complex interfaces and need considerable time to get used to,
and you need time to even learn where to find the functions, etc.
In the same way, somebody who is used to one make of video editor may flip out completely
at the user interface of one made by some other company.

For some of those programs you may need MONTHS of training before you can even claim to
be a user.

A simple example: Blender http://www.blender.org/ . I have had it on my system for, well,
more than 8 years, from its beginning. It can do amazing things; I even made leaders for VHS productions
with it. It gets more complex all the time, and learning to use it takes more
than a few days, not even counting your artistic capabilities.
So I am not buying 'that software sucks'.
If it crashes, OK, that is bad.

Anyway, you jump from one issue to the other. I'd say: relax, maybe the sun even shines over there;
here it is rainy. Do not get all worked up by software you do not like.
In the very beginning of Linux (I started with SLS Linux, where SLS stood for 'Softlanding Linux System'),
if I needed an application, then often it simply was not there, or was not the way I wanted it, so I wrote it
myself.
I am sure deep in your heart you _do_ like programming; I have seen your enthusiasm for your
jump tables in asm, so if it must be, start writing your own drawing program.
I mentioned Blender; that is how Blender started: those guys at NaN (Not A Number) wanted something
better, and they acted.
Else it is all just blah blah, and does not help the world; a world of complainers is no good to anybody.
 