Maker Pro

Re: Intel details future Larrabee graphics chip

Joerg

Jan said:
When I had the TV shop, some guy told me:
Go look at the competition, see how they do it.
And my reply was, and still is: Let the competition look at me :)


So, HOW do you think you will learn to program better by reading about how it is
done wrong?


By learning about mistakes that happened and about their consequences. I
learn a lot when I dissect a design tossed over my fence with the request
to find a solution and with comments such as "we can't reliably produce
this in quantities" or "we have this, that and the other problem with it
in the field".

[...]
 
Wilco Dijkstra

Ken Hagan said:
Whilst not disputing that hardware folks probably come to the table
with a rather different attitude to bugs...

Actually, despite this, hardware is full of bugs as well. It's just that most
people don't notice them. A lot of work in the BIOS and OSes goes into
making such bugs invisible to the average user. Hardware typically has
a long testing cycle with many revisions and bugfixes before release.
This catches most of the obvious issues, but hardware is by no means
bug free. A typical CPU has 10-100 known bugs (many of which are
worked around in software) and probably a similar number of unknown
ones. If things go wrong, people usually blame the software...
Given a problem to be solved by a combination of hardware and software,
the hardware is usually used *merely* (ducks for cover) to provide a
set of primitives which the software will then stick together in a
more complex way. The software would then be solving a more complex
or less well specified problem and might reasonably be expected to
have more of the bugs.

Also, hardware solves a well-defined problem with a limited set of inputs
and outputs, invariably well specified and not subject to changing
requirements. Given such a well-specified problem, it is actually possible to
implement it with 100% correctness even in software. Consider floating-point
emulation in software as an example of such a problem.

The problems software solves are usually larger and more complex, often
without a detailed specification and, worse, with constantly changing requirements.
Such problems are not even considered for hardware implementation, but
they are the reality for a lot of software.

Wilco
 
Wilco Dijkstra

Martin Brown said:
At last something that we can agree on. Too much software is not properly designed. Modern compilers these days have a
fair amount of static testing built in, but nothing like enough to solve the problem of below average coders and poor
or no specifications or design documents.

It will draw fire from the C is always wonderful crowd but I expect you can live with that. And it isn't strictly the
C language that is at fault. There is plenty of good well documented software written in C.

Exactly. A poor programmer is going to be a poor programmer no matter
what language they use. It's always fun to see how people believe that
so called "safe" languages are really a lot "safer". The bugs just move
elsewhere. For example garbage collection solves various pointer problems
that inexperienced programmers make. However it creates a whole new
set of problems. Or the runtime system or libraries are much bigger and so
contain their own set of bugs on each platform etc.
And there are languages from the Pascal, Algol and Ada families that are intrinsically better suited for robust
engineering development. But the market chose C/C++. Ada is a bit too top heavy for my taste.

I wouldn't mention Pascal as the standard version doesn't even support
modules etc. Newer variants like Oberon are better in that respect.
I agree on Ada. None of the existing languages are perfect, they all have
advantages and disadvantages. As long as there is no believable
replacement for C/C++, there is little incentive to switch.

Wilco
 
Nick Maclaren

|>
|> > At last something that we can agree on. Too much software is not properly designed. Modern compilers these days have a
|> > fair amount of static testing built in, but nothing like enough to solve the problem of below average coders and poor
|> > or no specifications or design documents.
|> >
|> > It will draw fire from the C is always wonderful crowd but I expect you can live with that. And it isn't strictly the
|> > C language that is at fault. There is plenty of good well documented software written in C.
|>
|> Exactly. A poor programmer is going to be a poor programmer no matter
|> what language they use. It's always fun to see how people believe that
|> so called "safe" languages are really a lot "safer". The bugs just move
|> elsewhere. For example garbage collection solves various pointer problems
|> that inexperienced programmers make. However it creates a whole new
|> set of problems. Or the runtime system or libraries are much bigger and so
|> contain their own set of bugs on each platform etc.

There are at least two major classes of bug where it is false, and
C does especially badly:

1) Bugs introduced by an inconsistent, ambiguous or unreasonable
standard.

2) Bugs which are provably easy to detect (e.g. array bound or
arithmetic overflow).


Regards,
Nick Maclaren.
 
Wilco Dijkstra

Nick Maclaren said:
|>
|> > At last something that we can agree on. Too much software is not properly designed. Modern compilers these days have a
|> > fair amount of static testing built in, but nothing like enough to solve the problem of below average coders and poor
|> > or no specifications or design documents.
|> >
|> > It will draw fire from the C is always wonderful crowd but I expect you can live with that. And it isn't strictly the
|> > C language that is at fault. There is plenty of good well documented software written in C.
|>
|> Exactly. A poor programmer is going to be a poor programmer no matter
|> what language they use. It's always fun to see how people believe that
|> so called "safe" languages are really a lot "safer". The bugs just move
|> elsewhere. For example garbage collection solves various pointer problems
|> that inexperienced programmers make. However it creates a whole new
|> set of problems. Or the runtime system or libraries are much bigger and so
|> contain their own set of bugs on each platform etc.

There are at least two major classes of bug where it is false, and
C does especially badly:

1) Bugs introduced by an inconsistent, ambiguous or unreasonable
standard.

It's certainly true the C standard is one of the worst specified. However most
compiler writers agree about the major omissions and platforms have ABIs that
specify everything else needed for binary compatibility (that includes features
like volatile, bitfield details etc). So things are not as bad in reality.
2) Bugs which are provably easy to detect (e.g. array bound or
arithmetic overflow).

That is an argument about what exactly should be part of a language and what
should be part of the libraries. Automatic strings, checked arrays and overflow
arithmetic are all things that are or can be supported easily in C/C++. There are
certainly advantages to having this built into the language; however, there are
drawbacks too.
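
As a rough illustration of the library-level checking being discussed (the helper names below are invented for this sketch, not any standard API; GCC and Clang also offer __builtin_add_overflow, but a portable check is shown instead), bounds-checked access and overflow-checked addition look like this in plain C:

#include <assert.h>
#include <limits.h>
#include <stddef.h>
#include <stdio.h>

/* Hypothetical checked-array helper: traps an out-of-range index. */
static int checked_get(const int *arr, size_t len, size_t idx)
{
    assert(idx < len && "array index out of bounds");
    return arr[idx];
}

/* Overflow-checked signed addition: returns 0 on success, -1 on overflow,
   instead of silently invoking undefined behaviour. */
static int checked_add(int a, int b, int *result)
{
    if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
        return -1;
    *result = a + b;
    return 0;
}

int main(void)
{
    int data[4] = { 1, 2, 3, 4 };
    int sum;

    printf("%d\n", checked_get(data, 4, 2));   /* ok: prints 3 */
    if (checked_add(INT_MAX, 1, &sum) != 0)
        printf("overflow detected\n");         /* caught at run time */
    return 0;
}

The drawback, of course, is that such checks only apply where the programmer remembers to call them, which is the trade-off against building them into the language.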

Wilco
 
JosephKK

John said:
A lot of hardware sorts of stuff, like TCP/IP stack accelerators,
could be done in a dedicated CPU. Sort of like using a PIC to blink an
LED. Part of the channel-controller thing was driven by not wanting to
burden an expensive CPU with scut work and interrupts and context
switching overhead. All that stops mattering when CPUs are free. Of
course, disk controllers and graphics processors would still be
needed, but simpler ones and fewer of them.

Multicore is especially interesting for embedded systems, where there
are likely a modest number of processes and no dynamic add/drop of
tasks. The most critical ones, like an important servo loop, could be
dedicated and brutally simple. Freescale is already going multicore on
embedded chips, and I think others are, too. The RTOS boys are *not*
going to like this.

John

Between that and the tricks you can do with FPGAs, most certainly.
 
JosephKK

Guy said:
John said:
Guy Macon <http://www.GuyMacon.com/> wrote:

Kim Enkovaara wrote:
At least in the high-end of the embedded systems processor updates and
model changes are quite frequent. The lifetimes of processors
and their peripherials (especially DRAM memories) is becoming shorter
all the time. The code has to be portable and easily adaptable to
different platforms.
And at the very low end, changes to completely different processors
are also very common. If someone comes up with a micro that costs
8.4 cents and replaces a part that costs 8.5 cents, that's a saving
of $16,800 per week at a production rate of 100,000 units per hour.
After a while you get the attitude of "ho hum, another assembly
language instruction set."
Assembly? You don't program musical greeting cards and blinking
sneakers in Python?

Actually, I use the BEST programming language.

BEST is a programming language that I developed
to answer the frequently asked question "Which
programming language is best?" once and for all.

BEST is a Befunge-93[2] pseudocompiler written in
Malbolge[3][6] with library calls to routines written
in Microsoft[4] Visual BogusFORTH+++[7] (!Kung edition)[9]
that invoke various trinary functions written in[5]
Reverse Polish Whitespace (for clarity).[1]

You will just love Unlambda then...
http://www.madore.org/~david/programs/unlambda/#what_is
;-)

Simplified, Turing complete but devilishly hard to do anything in.

Regards,
Martin Brown

Sounds similar to APL.
 
JosephKK

A single database file, of fixed-length records, programmed in
PowerBasic, direct linear search to look things up. Fast as hell and
bulletproof.

Only for fairly small numbers of records, not more than a few
thousand. Try it with a modest database of, say, several million
records. Big databases have several relational files with over a
billion records. Data warehouses hit trillions of records spread across
tens of thousands of files or more. PowerBasic simply will not scale
that high, nor will linear searches.
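
To put a number on that, here is a minimal C sketch (the record layout and part numbers are invented) contrasting a linear scan with bsearch() over sorted fixed-length records; the first is O(n) per lookup, the second O(log n), and that gap is what starts to hurt at millions of records:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef struct {            /* invented fixed-length record */
    char key[16];
    char value[48];
} record;

/* O(n): fine for a few thousand records, painful for millions. */
static const record *find_linear(const record *recs, size_t n, const char *key)
{
    for (size_t i = 0; i < n; i++)
        if (strncmp(recs[i].key, key, sizeof recs[i].key) == 0)
            return &recs[i];
    return NULL;
}

static int cmp_key_to_record(const void *key, const void *elem)
{
    const record *r = elem;
    return strncmp(key, r->key, sizeof r->key);
}

/* O(log n): requires the records to be kept sorted by key. */
static const record *find_binary(const record *recs, size_t n, const char *key)
{
    return bsearch(key, recs, n, sizeof *recs, cmp_key_to_record);
}

int main(void)
{
    record recs[] = {         /* already sorted by key */
        { "C-1001", "capacitor 10uF" },
        { "R-0220", "resistor 220R"  },
        { "U-7805", "regulator 7805" },
    };
    size_t n = sizeof recs / sizeof recs[0];
    const record *hit;

    hit = find_linear(recs, n, "U-7805");
    printf("%s\n", hit ? hit->value : "not found");
    hit = find_binary(recs, n, "R-0220");
    printf("%s\n", hit ? hit->value : "not found");
    return 0;
}

Past a certain size even that is not enough, and the real answer is on-disk indexing of the kind a database engine provides.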

On the other hand, for a few hundred records I have pressed word
processing programs and spreadsheets into service and made them work
nicely.
 
Chris M. Thomasson

Terje Mathisen said:
You, my friend, have a sick mind.

OTOH, the first macro language I came up with, for a terminal
emulator/file transfer program, was only slightly more readable. :-(

Ouch.

[...]
 
Nick Maclaren

|>
|> It's certainly true the C standard is one of the worst specified. However most
|> compiler writers agree about the major omissions and platforms have ABIs that
|> specify everything else needed for binary compatibility (that includes features
|> like volatile, bitfield details etc). So things are not as bad in reality.

Er, no. I have a LOT of experience with serious code porting, and
am used as an expert of last resort. Most niches have their own
interpretations of C, but none of them use the same ones, and only
programmers with a very wide experience can write portable code.

Note that any application that relies on ghastly kludges like
autoconfigure is not portable, not even remotely. And the ways
in which that horror is used (and often comments in its input)
shows just how bad the C 'standard' is. Very often, 75% of that
is to bypass deficiencies in the C standard.

A simple question: have you ever ported a significant amount of
code (say, > 250,000 lines in > 10 independent programs written
by people you have no contact with) to a system with a conforming
C system, based on different concepts to anything the authors
were familiar with? I have.


Regards,
Nick Maclaren.
 
Chris M. Thomasson

Terje Mathisen said:
The most common serious problem I've seen with programs written in GC
languages is when they try to do any kind of external communication:

I.e. using Oracle database handles as if they were an infinite resource
which the GC will always be able to clean up is a very good way to crash
all the web front ends that share a common back-end DB.

I thought GC was the silver bullet of memory management!

lol.
 
Chris M. Thomasson

Nick Maclaren said:
|>
|> It's certainly true the C standard is one of the worst specified. However most
|> compiler writers agree about the major omissions and platforms have ABIs that
|> specify everything else needed for binary compatibility (that includes features
|> like volatile, bitfield details etc). So things are not as bad in reality.

Er, no. I have a LOT of experience with serious code porting, and
am used as an expert of last resort.

I hope you get paid well! Seriously.
 
Nick Maclaren

|>
|> > Er, no. I have a LOT of experience with serious code porting, and
|> > am used as an expert of last resort.
|>
|> I hope you get paid well! Seriously.

Not in this market-driven world of ours! Very, very few technical
people do - at least compared with comparable people in marketing
or the adminisphere.


Regards,
Nick Maclaren.
 
krw

|>
|> > Er, no. I have a LOT of experience with serious code porting, and
|> > am used as an expert of last resort.
|>
|> I hope you get paid well! Seriously.

Not in this market-driven world of ours! Very, very few technical
people do - at least compared with comparable people in marketing
or the adminisphere.

What do you consider "paid well"?
 
Bernd Paysan

Martin said:
It might be an interesting academic study to see how the error rates of
hardware engineers using VHDL compare with those using Verilog tools for
the same sorts of design. The latter I believe is less strongly typed.

Well, first of all, Verilog has far fewer types. There are only bits, bit
vectors, 32-bit integers, and floats. You can't use the latter for
synthesis; usually only bits and bit vectors are used for register data
types.

My experience is that people make far fewer errors in Verilog, because it's
all straightforward, with not many traps to fall into. E.g. a typical VHDL
error is that you define an integer subrange from 0..15 instead of a 4-bit
vector, and then forget to mask the add, so that it doesn't wrap around but
fails instead.
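
Since the code examples in this thread are C rather than an HDL, here is a rough C analogue of that trap (the counter is purely illustrative): a value declared as a plain integer only behaves like a 4-bit vector if you remember the mask on every update, whereas a hardware register wraps by itself.

#include <stdio.h>

int main(void)
{
    unsigned counter = 0;

    for (int i = 0; i < 20; i++) {
        /* A 4-bit register wraps from 15 back to 0 on its own.  A plain
           integer does not: forget the "& 0xF" and it marches past 15,
           which is the mismatch described above. */
        counter = (counter + 1) & 0xF;
    }
    printf("%u\n", counter);   /* 20 increments mod 16 -> prints 4 */
    return 0;
}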

My opinion towards good tools:

* Straightforward operations
* Simple semantics
* Don't offer several choices where one is sufficient
* Restrict people to a certain common style where the tool allows choices
 
Bernd Paysan

John said:
A lot of hardware sorts of stuff, like TCP/IP stack accelerators,
could be done in a dedicated CPU. Sort of like using a PIC to blink an
LED. Part of the channel-controller thing was driven by not wanting to
burden an expensive CPU with scut work and interrupts and context
switching overhead. All that stops mattering when CPUs are free. Of
course, disk controllers and graphics processors would still be
needed, but simpler ones and fewer of them.

Come on, when CPUs are almost free, dedicated IO CPUs are still a lot
cheaper. You can have more of them in the same die area. They might still
have the same basic instruction set, just with different performance
tradeoffs.

You might put a few fast cores on the die, which give you maximum
performance for single-threaded applications. Then, you put a number of
slower cores on it, for maximum multi-threaded performance. And then,
another even slower and simpler type of core for IO.

When cores are cheap, it makes sense to build them for their purpose.
 
Wilco Dijkstra

Nick Maclaren said:
|>
|> It's certainly true the C standard is one of the worst specified. However most
|> compiler writers agree about the major omissions and platforms have ABIs that
|> specify everything else needed for binary compatibility (that includes features
|> like volatile, bitfield details etc). So things are not as bad in reality.

Er, no. I have a LOT of experience with serious code porting, and
am used as an expert of last resort. Most niches have their own
interpretations of C, but none of them use the same ones, and only
programmers with a very wide experience can write portable code.

Can you give examples of such different interpretations? There are a
few areas that people disagree about, but it often doesn't matter much.

Interestingly most code is widely portable despite most programmers
having little understanding about portability and violating the C standard in
almost every respect.
Note that any application that relies on ghastly kludges like
autoconfigure is not portable, not even remotely. And the ways
in which that horror is used (and often comments in its input)
shows just how bad the C 'standard' is. Very often, 75% of that
is to bypass deficiencies in the C standard.

Actually you don't need any "autoconfiguring" in C. Much of that was
needed due to badly broken non-conformant Unix compilers. I do see
such terrible mess every now and again, with people declaring builtin
functions incorrectly as otherwise "it wouldn't compile on compiler X"...

Properly sized types like int32_t have finally been standardized, so the
only configuration you need is the selection between the various extensions
that have not yet been standardized (although things like __declspec are
widely accepted nowadays).
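
As a hedged sketch of how little configuration that leaves (the _MSC_VER threshold below reflects that Microsoft only shipped <stdint.h> from Visual Studio 2010 onwards; check your own toolchain before relying on it), the remaining work can be confined to one small header:

/* portable_types.h -- minimal sketch of the residual configuration. */
#ifndef PORTABLE_TYPES_H
#define PORTABLE_TYPES_H

#if defined(_MSC_VER) && _MSC_VER < 1600   /* pre-2010 MSVC: no <stdint.h> */
typedef signed   __int8    int8_t;
typedef unsigned __int8    uint8_t;
typedef signed   __int16   int16_t;
typedef unsigned __int16   uint16_t;
typedef signed   __int32   int32_t;
typedef unsigned __int32   uint32_t;
#else
#include <stdint.h>                        /* C99 and later: use the standard header */
#endif

#endif /* PORTABLE_TYPES_H */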
A simple question: have you ever ported a significant amount of
code (say, > 250,000 lines in > 10 independent programs written
by people you have no contact with) to a system with a conforming
C system, based on different concepts to anything the authors
were familiar with? I have.

I've done a lot of porting and know most of the problems. It's not nearly
as bad as you claim. Many "porting" issues are actually caused by bugs
and limitations in the underlying OS. I suggest that your experience is
partly colored by the fact that people ask you as a last resort.

Wilco
 
Nick Maclaren

|>
|> > |> It's certainly true the C standard is one of the worst specified. However most
|> > |> compiler writers agree about the major omissions and platforms have ABIs that
|> > |> specify everything else needed for binary compatibility (that includes features
|> > |> like volatile, bitfield details etc). So things are not as bad in reality.
|> >
|> > Er, no. I have a LOT of experience with serious code porting, and
|> > am used as an expert of last resort. Most niches have their own
|> > interpretations of C, but none of them use the same ones, and only
|> > programmers with a very wide experience can write portable code.
|>
|> Can you give examples of such different interpretations? There are a
|> few areas that people disagree about, but it often doesn't matter much.

It does as soon as you switch on serious optimisation, or use a CPU
with unusual characteristics; both are common in HPC and rare outside
it. Note that compilers like gcc do not have any options that count
as serious optimisation.

I could send you my Objects diatribe, unless you already have it,
which describes one aspect. You can also add anything involving
sequence points (including functions in the library that may be
implemented as macros), anything involving alignment, when a library
function must return an error (if ever) and when it is allowed to
flag no error and go bananas. And more.

|> Interestingly most code is widely portable despite most programmers
|> having little understanding about portability and violating the C standard in
|> almost every respect.

That is completely wrong, as you will discover if you ever need to
port to a system that isn't just a variant of one you are familiar
with. Perhaps 1% of even the better 'public domain' sources will
compile and run on such systems - I got a lot of messages from
people flabbergasted that my C did.

|> Actually you don't need any "autoconfiguring" in C. Much of that was
|> needed due to badly broken non-conformant Unix compilers. I do see
|> such terrible mess every now and again, with people declaring builtin
|> functions incorrectly as otherwise "it wouldn't compile on compiler X"...

Many of those are actually defects in the standard, if you look more
closely.

|> Properly sized types like int32_t have finally been standardized, so the
|> only configuration you need is the selection between the various extensions
|> that have not yet been standardized (although things like __declspec are
|> widely accepted nowadays).

"Properly sized types like int32_t", forsooth! Those abominations
are precisely the wrong way to achieve portability over a wide range
of systems or over the long term. I shall be dead and buried when
the 64->128 change hits, but people will discover their error then,
oh, yes, they will!

int32_t should be used ONLY for external interfaces, and it doesn't
help with them because it doesn't specify the endianness or overflow
handling. And not all interfaces are the same. All internal types
should be selected as to their function - e.g. array indices, file
pointers, hash code values or whatever - so that they will match the
system's properties. As in Fortran, K&R C etc.
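
That limitation is easy to demonstrate: a minimal sketch (buffer contents invented) of reading a little-endian 32-bit field byte by byte, which works on any host precisely because uint32_t fixes the width but says nothing about byte order:

#include <stdint.h>
#include <stdio.h>

/* Reassemble a little-endian 32-bit value from individual bytes.  The
   destination type fixes the width; the byte order is still our problem. */
static uint32_t read_le32(const uint8_t *p)
{
    return (uint32_t)p[0]
         | ((uint32_t)p[1] << 8)
         | ((uint32_t)p[2] << 16)
         | ((uint32_t)p[3] << 24);
}

int main(void)
{
    const uint8_t buf[4] = { 0x78, 0x56, 0x34, 0x12 };      /* invented data */
    printf("0x%08lx\n", (unsigned long)read_le32(buf));     /* 0x12345678 on any host */
    return 0;
}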

|> > A simple question: have you ever ported a significant amount of
|> > code (say, > 250,000 lines in > 10 independent programs written
|> > by people you have no contact with) to a system with a conforming
|> > C system, based on different concepts to anything the authors
|> > were familiar with? I have.
|>
|> I've done a lot of porting and know most of the problems. It's not nearly
|> as bad as you claim. Many "porting" issues are actually caused by bugs
|> and limitations in the underlying OS. I suggest that your experience is
|> partly colored by the fact that people ask you as a last resort.

Partly, yes. But I am pretty certain that my experience is a lot
wider than yours. I really do mean different CONCEPTS - start with
IBM MVS and move on to a Hitachi SR2201, just during the C era.

Note that I was involved in both the C89 and C99 standardisation
process; and the BSI didn't vote "no" for no good reason.


Regards,
Nick Maclaren.
 
Jan Panteltje

"Properly sized types like int32_t", forsooth! Those abominations
are precisely the wrong way to achieve portability over a wide range
of systems or over the long term.

I dare say you show cluelessness.
I shall be dead and buried when
the 64->128 change hits, but people will discover their error then,
oh, yes, they will!

No, int32_t and friends became NECESSARY when the 32 to 64 wave hit.
A simple example, an audio wave header spec:
#ifndef _WAVE_HEADER_H_
#define _WAVE_HEADER_H_

#include <stdint.h>

typedef struct
{ /* header for WAV files */
    uint8_t  main_chunk[4];  /* 'RIFF' */
    uint32_t length;         /* length of file */
    uint8_t  chunk_type[4];  /* 'WAVE' */
    uint8_t  sub_chunk[4];   /* 'fmt' */
    uint32_t length_chunk;   /* length of sub_chunk, always 16 bytes */
    uint16_t format;         /* always 1 = PCM code */
    uint16_t modus;          /* 1 = mono, 2 = stereo */
    uint32_t sample_fq;      /* sample frequency */
    uint32_t byte_p_sec;     /* data bytes per second */
    uint16_t byte_p_spl;     /* bytes per sample: 1 = 8 bit, 2 = 16 bit (mono); 2 = 8 bit, 4 = 16 bit (stereo) */
    uint16_t bit_p_spl;      /* bits per sample: 8, 12, 16 */
    uint8_t  data_chunk[4];  /* 'data' */
    uint32_t data_length;    /* length of data */
} wave_header;

#endif /* _WAVE_HEADER_H_ */


Now in the OLD version it used 'int', and when 'int' changed size again (in bits), of
course the whole structure was different.
I am so happy with uint8_t; if you are closer to hardware you will understand why.
You may claim that that is an 'external' interface, so be it,
but it is nice to be constantly aware of the width of variables.
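
A minimal usage sketch of the header above (the file name and the include name are invented here); reading the structure wholesale like this additionally assumes a little-endian host and no struct padding, which this particular field layout avoids on common ABIs:

#include <stdio.h>
#include <string.h>
#include "wave_header.h"   /* the struct shown above, assumed to live in this file */

int main(void)
{
    wave_header hdr;
    FILE *f = fopen("test.wav", "rb");   /* invented file name */

    if (!f || fread(&hdr, sizeof hdr, 1, f) != 1) {
        fprintf(stderr, "could not read WAV header\n");
        return 1;
    }
    if (memcmp(hdr.main_chunk, "RIFF", 4) != 0) {
        fprintf(stderr, "not a RIFF file\n");
        fclose(f);
        return 1;
    }
    printf("%lu Hz, %u channel(s), %u bits per sample\n",
           (unsigned long)hdr.sample_fq,
           (unsigned)hdr.modus,
           (unsigned)hdr.bit_p_spl);
    fclose(f);
    return 0;
}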

int32_t should be used ONLY for external interfaces, and it doesn't
help with them because it doesn't specify the endianness or overflow
handling. And not all interfaces are the same. All internal types
should be selected as to their function - e.g. array indices, file
pointers, hash code values or whatever - so that they will match the
system's properties. As in Fortran, K&R C etc.

The _t types are great, as it is exactly that which creates portability.
From libc.info:

 - Function: int fseeko (FILE *STREAM, off_t OFFSET, int WHENCE)
     This function is similar to `fseek' but it corrects a problem with
     `fseek' in a system with POSIX types.  Using a value of type `long
     int' for the offset is not compatible with POSIX.  `fseeko' uses
     the correct type `off_t' for the OFFSET parameter.
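
A minimal usage sketch of that call (file name invented); on glibc you also have to ask for a 64-bit off_t explicitly on 32-bit systems, hence the feature-test macros at the top:

#define _POSIX_C_SOURCE 200112L   /* declare fseeko */
#define _FILE_OFFSET_BITS 64      /* 64-bit off_t even on 32-bit glibc systems */

#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    FILE *f = fopen("big.dat", "rb");            /* invented file name */
    if (!f)
        return 1;

    /* off_t, not long int: large offsets survive even where long is 32-bit. */
    off_t pos = (off_t)5 * 1024 * 1024 * 1024;   /* 5 GiB into the file */
    if (fseeko(f, pos, SEEK_SET) != 0) {
        perror("fseeko");
        fclose(f);
        return 1;
    }
    fclose(f);
    return 0;
}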
Note that I was involved in both the C89 and C99 standardisation
process; and the BSI didn't vote "no" for no good reason.

So, when we move to 128 bits (if ever), at least my programs should still work.

You know, I do not like people being pedantic, maybe because they often are right
and make your code look silly or bad.
I try to code in the lowest common denominator subset of C, avoiding exotic constructs.
That goes a long way; so far most compilers swallow it, but ultimately libc, with libc.info,
is my reference.
The whole world will soon move to Unix and gcc anyway; even John Larkin will learn C...

<disclaimer>
This posting contains forward looking statements that may or may not be true.
 
Joerg

John said:
Right now, we have about 4800 different parts in stock, and about 600
parts lists (BOMs). Why use SQL on that? Why make a monster out of a
simple problem? Searches are so fast you can't see them happen, and
there's no database maintenance, no linked lists to get tangled, no
index files.

Way to go. Although Access could also do that with next to nothing in
programming effort. Just set up the fields, some practical queries, the
reports you regularly need, done.
 