Maker Pro

Is microprocessor an integrated circuit???


Bradley1234

Jan 1, 1970
0
I quickly found 10 bit/byte also:
http://www.bartleby.com/64/C004/010.html

quote:
A sequence of 8 bits is so important that it has a name of its own: a byte.
A handy mnemonic for byte is binary digit eight. Thus, it requires one byte
of information (or 8 bits) to transmit one keystroke on a typical keyboard.
It requires three bytes of information (or 24 bits) to transmit the
three-letter word the.

end quote

(Source: section 4, "Science Terms: Distinctions, Restrictions, and Confusions", § 10. bit / byte)


Ken Smith said:
With "10 bit byte" in Google, I quickly found a newish (2003) document
that speaks of 10 bit bytes, written by IBM of all people.

Care to post a link to that document? Is it the currently accepted industry
standard as I asked?
 

keith

Jan 1, 1970
0
If you have a PLA and a ROM in a black box, and are allowed to observe
the outputs only after they have settled, what difference is there
between the two?

Sure, sorta. I can tell a ROM from a PLA, but perhaps not a PLA from a
ROM. Think about a 2D map of inputs vs. outputs. A PLA (after arranging
the I/O) will have square holes in the map 2^n in size. A ROM likely
won't. Given a random change of inputs, a ROM will also have a
probability of a change of outputs greater than that of a PLA. Or, put
another way, a PLA contains less information than a ROM.
 

keith

Jan 1, 1970
0
Hisssss. Or something.

Ah, man! Even though you live in SF, I always thought you were more of
a herrrrr kinda guy. I'm shocked! Shocked, I tell ya!
 

John Larkin

Jan 1, 1970
0
Ah, man! Even though you live in SF, I always thought you were more of
a herrrrr kinda guy. I'm shocked! Shocked, I tell ya!


Oh, go away before I hit you with my purse.

John
 

Andrew Holme

Jan 1, 1970
0
Keith said:
A microprocessor without an instruction set is, umm, useless?

My preferred criterion is: is there a micro-program counter?

On reflection: this makes the 6502 microcoded because the instruction
register and T state counter, which feed into the logic array, can be
thought of as high and low words of a micro-program address.
I can list this "PLA" microcode (I have the power;), but it never
exists in hardware. It's converted by the synthesis tools from VHDL
into gates. What does that make it? Is it microprogrammed? The
source sure looks like a PLA. It's even called a PLA in the source.
It's random logic on the chip though.

Is there anything in your processor that resembles a micro-program counter?
I'm telling you that "thar be dragons" if you insist on defining
things with nice black lines.

Of course; but sometimes simplification / generalisation can be useful /
helpful for getting ideas across.
 

Ken Smith

Jan 1, 1970
0
Ken Smith said:
With "10 bit byte" in Google, I quickly found a newish (2003) document
that speaks of 10 bit bytes, written by IBM of all people.

Care to post a link to that document? Is it the currently accepted industry
standard as I asked?

I said from IBM, which is a small typewriter company and actually of
little importance in the computer business, but if you care to read the
document it is at their site as:

www.research.ibm.com/journal/rd/464/stigliani.pdf
 

keith

Jan 1, 1970
0
My preferred criterion is: is there a micro-program counter?

Ah, so what about a processor that has no program counter? Is that not a
processor? Can it not be programmed? Does it not have programs?
On reflection: this makes the 6502 microcoded because the instruction
register and T state counter, which feed into the logic array, can be
thought of as high and low words of a micro-program address.


Is there anything in your processor that resembles a micro-program
counter?

There isn't even a program counter, at least not one that exists as an
entity that can be identified as such. It's not some strange little
processor off in the corner of its own little world. It's fairly
mainstream. These concepts of "microprogrammed or hard-wired" are rather
old, and the difference is really blurry these days.
Of course; but sometimes simplication / generalisation can be useful /
helpful for getting ideas across.

For education, sure. Most of the distinctions we're talking about aren't
really so much of a distinction at all. FPGAs, for example, use the same
SRAMs as "gates" for doing logic operations, registers, and memory.
 

Bradley1234

Jan 1, 1970
0
Not that I would defend IBM, it's funny you reference them that way. I must
conclude you know a great deal about IBM.

It appears in the article that the 8 bit byte is enclosed in a 10 bit
wrapper, probably a parity bit for every 4 bits?
 

TCS

Jan 1, 1970
0
Ah, so what about a processor that has no program counter? Is that not a
processor? Can it not be programmed? Does it not have programs?

PC != MPC
 

keith

Jan 1, 1970
0
The ISO appears to sell it for $399. Those kinds of publications are
pricey; that's a fact of life. If someone found it for $18, that sounds
like a bargain

So buy it and read it. You're in for an eye-opener.
Or maybe I'm so smart, so amazingly qualified, and so experienced that I'm
not afraid to be misinterpreted or misunderestimated

Or so arrogant that you cannot let go of a long held belief in the tooth
fairy after being told by your dentist that they don't exist. "But my
mommy wouldn't lie."
I admit that I want a refund on some of the tuition I paid because the
school was incompetent, I was overqualified. You jump to conclusions,
not very open minded

They were incompetent, obviously. Your education is sorely lacking as a
result. Add that to your suit.
I said an 8 bit byte can represent the range of -127 to +127 and you
said: "Nope"

You didn't say "8 bit byte". You said "byte". Even so, you're still
wrong. An 8 bit byte can store numbers from 0 to 255 (unsigned) or -128
to +127 (signed two's complement).

The "nope" stands.
It's like arguing that the inch scale of measurement is arbitrary; it's
not. With only a few rare exceptions in history, a byte is 8 bits, and
is the standard. We note with disk and memory sizes, they are rated in
megabytes or gigabytes. Network interface speeds in megabytes per second.

It's not rare and can be redefined. The definition of a "byte" is not
fixed at a certain size. Get over it. Learn your lesson and move along.
Western Digital could double their disk size by saying a byte is now 16
bits wide.

If they were selling it to be used in a 16 bit byte machine, sure (wrong
way, but I get your drift). Since they're advertising it (and it's
formatted for) a system that uses 8 bit bytes they have to be consistent
with the size.

I bet it torques your jaws that they measure the size in
decimal too.
One of the other factors in using the byte system is the Base 2 math,
and unless IM mistaken shifting left or right will double or divide by 2
the binary base 2 number. Designers would therefore prefer a byte that
is a multiple, 2, 4, 8, 16, 32

Six bit bytes were common. IIRC the CDC Cyber series had a 60-bit
word and a 6-bit byte. As has been noted, IBM talks of a 10-bit byte in
one of the I/O processors.
But since its already 8 bits, thats how its used.

Nope. It's used by the ignorant to mean eight bits, or it's understood by
the context that it's eight bits. A byte is in no way defined as being
eight bits though.
When a person begins losing a debate and has no declarative or
informative contribution, the discussion degrades into separate and
unique steps: personal attacks (pig stupid), then profanity, then rage,
then violence.

Violence? No. Frustration with a pig-headed troll? Certainly.
If you claim to have understood binary at that specific point in time,
you would be over 100 years old. Congratulations that you are coherent
and not senile

More than half that, but I was doing binary (and in all bases up to 32 -
got awkward above) in fifth grade, well more than forty years ago. You
simply sounded like a snot-nosed kid who thinks he knows everything.

How many bits are in a byte?
Yes, I researched the site; the oddball, erroneous, unauthorized uses of
the term "byte" exist. But the SAE measurement of inches has a better
chance of being converted to metric than the byte has of being
other than 8 bits.

You're hopeless. I hope you're not involved in any engineering more
complicated than a toaster.
When a byte is represented as 9 bits, the 8 bits remain as the data; the
extra bit is a parity bit, meaning it's a type of wrapper component, not
a literal 9-bit word. If anyone in history combined 3 octal digits into
a 9 bit word, it's technically not the same.

Did I say anything about parity or representations of binary numbers?
I'm talking about sizeof(byte) *not* being fixed at eight. It is
*usually* equal to eight, but if you assume that it's a fact it will come
back to byte someone. ;-)


IBM? Yes. The official 'C' Standard. Yes. Life? Yes. Get over it,
you're wrong.
That's what it means, by the majority of the industry, and your failure
to provide any examples in the industry shows that I was correct
I've proven your definition wrong by example. There are many
"microprocessors" that are *not* microprogrammed. There are many
microprogrammed processors that are *not* microprocessors. An IBM 360 is
hardly a microprocessor, though the "same architecture" some forty
years later, known as the z-Series are, in fact, microprocessors. Some
models are microprogrammed, some are hard-wired. The terms are orthogonal.

Yet you've not shown one authoritative example of a VAX-11/780 being
called a "microprocessor".
Well then you won't want to know I've contributed for years to make those
things that launch into orbit, whatever they are called, go up and keep
the range safe so they fly up there and spin around or whatever they do.

I don't believe a McD's would hire me with my resume, too overqualified

They're afraid you couldn't learn the process.
Okay you win, I'm sorry to upset you so much, if you want to say a 6 bit
byte is a byte... no, I won't do it, you're wrong, I can't even pretend to
concede, it's an 8 bit byte

You can continue life being ignorant or you can learn. Your choice.
Well if you knew binary over 100 years ago, you should then of course
know about the Hollerith punched card and what 2629 is.

I know what an 026 is, and an 029. I used them in college, and a couple
of times since. 2629 to me is the model number of my laptop (ThinkPad
A21p).
I like punched card equipment, it was fun to work on with the mechanical
stuff and electrical also.

My guess is that you're not talking about a ThinkPad.
Well, people do say everyone has to work harder when I'm around
To clean up your messes, no doubt.

BTW, a decent newsreader would be a good "investment".
 

keith

Jan 1, 1970
0
quote and reply from my post:


Thanks for proving what I said was correct. The 11/780 CPU, meaning its
processor board(s). If I recall there were options; you could have a
floating point unit or some other thing to add also. The CPU would be a
collection of large-sized boards, which were..... drum roll please........
microprogrammed

That does *not* make them a microprocessor. That's the point you're
continually missing.
The 11/780 chassis had cpu boards, I/O processor boards, memory boards
and large power supplies; it usually had a tape drive, some user console,
and most likely some large removable disk. It was a supermini system. The
reference to the cpu meant that part of the system. It's been a long
time since I've dealt with one.

....and in *no* way a microprocessor. Even Pigbladder is losing it over
your silly definition of "microprocessor".
 

keith

Jan 1, 1970
0
PC != MPC

Sure, but if a processor can exist without a program counter a
micro-programmed machine can exist without a micro-program counter. A
counter is a means to an end. It's not an end.
 

MikeMandaville

Jan 1, 1970
0
Ken Smith said:
At this point if Bradly1234 said the sky was blue, I'd look for
myself.

I recall from the movie about Alexander Graham Bell that when Watson
plucked what can be thought of as an accordion reed over what can be
thought of as an electric guitar pickup, Bell remarked that they were
the first people in history to send a musical note over a wire. Watson
replied that this was nonsense, since Bell had only heard what every
electrician had heard at one time or another during the natural course
of his work. The difference, of course, is that Bell had understood
the significance of the event.
 

John Larkin

Jan 1, 1970
0
In the 1970s minicomputers had what are now retroactively defined as
"complex" instructions to perform operations like say allocating a
free page of physical memory to a program's virtual address space. In
the Modcomp IV this was the AMEM instruction. Two other "complex"
instructions in that machine were the MMRB (Move Memory to Register
Block) and MRBM (Move Register Block to Memory).

The VAX had POLY: evaluate a polynomial. Most of the transcendental
math functions were microcoded, too.

John
 

John Larkin

Jan 1, 1970
0
Sure, but if a processor can exist without a program counter a
micro-programmed machine can exist without a micro-program counter. A
counter is a means to an end. It's not an end.

Some of the low-end COPS machines used a pseudo-random shift register
instead of a program counter. That reduced prop delays and hardware
complexity, but sequential instructions hopped all over the place. I
know a guy who actually programmed one of these monsters (used it in a
commercial ignition timing strobe, with all sorts of tricky timing
loops) and he's still fairly coherent.

There are a few web sites devoted to designing computer languages and
architectures with the worst possible structures, apparently unaware
that National beat them to it.

John
 

John Woodgate

Jan 1, 1970
0
I read in sci.electronics.design that MikeMandaville
(..googlegroups.com>) wrote about 'Is microprocessor an integrated circuit???':
I recall from the movie about Alexander Graham Bell that when Watson
plucked what can be thought of as an accordion reed over what can be
thought of as an electric guitar pickup, Bell remarked that they were
the first people in history to send a musical note over a wire. Watson
replied that this was nonsense, since Bell had only heard what every
electrician had heard at one time or another during the natural course
of his work. The difference, of course, is that Bell had understood the
significance of the event.

Shouldn't he be sued, then, for facilitating wholesale breaches of
recording copyrights? (;-)
 

keith

Jan 1, 1970
0
Some of the low-end COPS machines used a pseudo-random shift register
instead of a program counter. That reduced prop delays and hardware
complexity, but sequential instructions hopped all over the place. I
know a guy who actually programmed one of these monsters (used it in a
commercial ignition timing strobe, with all sorts of tricky timing
loops) and he's still fairly coherent.

I'm familiar with a few microprocessors that used LFSRs as program
"counters". It's still a counter, just not a binary counter.
There are a few web sites devoted to designing computer languages and
architectures with the worst possible structures, apparently unaware
that National beat them to it.

;-)

I rather liked their PACE micro though. The first 16 bit micro (though
the ALU was only eight).
 

Bradley1234

Jan 1, 1970
0
keith said:
So buy it and read it. You're in for an eye-opener.

Show me where the official version is on sale for $18
Or so arrogant that you cannot let go of a long held belief in the tooth
fairy after being told by your dentist that they don't exist. "But my
mommy wouldn't lie."

The tooth fairy? She is married to Jorgen Von Strangle, the toughest fairy
in fairy world

They were incompetent, obviously. Your education is sorely lacking as a
result. Add that to your suit.

Let me ask you a simple question, do you know what BiCMOS is? (not a trick
question)

You didn't say "8 bit byte". You said "byte". Even so, you're still
wrong. An 8 bit byte can store numbers from 0 to 255 (unsigned) or -128
to +127 (signed two's complement).

The "nope" stands.






It's not rare and can be redefined. The definition of a "byte" is not
fixed at a certain size. Get over it. Learn your lesson and move along.

You have never shown any evidence of this being the current standard, yet
you continue to claim the byte definition is arbitrary, and want to hurry to
change the subject now when there is nothing to support what you thought was
an arbitrary byte size


If they were selling it to be used in a 16 bit byte machine, sure (wrong
way, but I get your drift). Since they're advertising it (and it's
formatted for) a system that uses 8 bit bytes they have to be consistent
with the size.

I bet it torques your jaws that they measure the size in
decimal too.

Right, a 4 bit byte would double it, not 16. But decimal? Not a problem.
Six bit bytes were common. IIRC the CDC Cyber series had a 60-bit
word and a 6-bit byte. As has been noted, IBM talks of a 10-bit byte in
one of the I/O processors.

The CDC Cyber series, but is there any documentation on those old Seymour
designs? How can we verify the internal cpu stuff anymore? They got much
of the architecture from Univac, some would argue

And since we are being detailed, the 10 bit byte in an IBM (a small
typewriter company that doesn't influence the computer world) uses an 8 bit
byte for data, and 2 bits for parity. Therefore that 10 bit byte IS an 8
bit byte; parity is not part of the byte data, it's a wrapper

Nope. It's used by the ignorant to mean eight bits, or it's understood by
the context that it's eight bits. A byte is in no way defined as being
eight bits though.

Violence? No. Frustration with a pig-headed troll? Certainly.

I'm here responding and taking smack from people and discussing the point,
hardly trollish. Newbie to the forum, yes.
More than half that, but I was doing binary (and in all bases up to 32 -
got awkward above) in fifth grade, well more than forty years ago. You
simply sounded like a snot-nosed kid who thinks he knows everything.

Yeah, a lot of people say that.

How many bits are in a byte?

You're hopeless. I hope you're not involved in any engineering more
complicated than a toaster

I was trying to get work at a nuke u lar facility doing control systems.
seriously
Did I say anything about parity or representations of binary numbers?
I'm talking about sizeof(byte) *not* being fixed at eight. It is
*usually* equal to eight, but if you assume that it's a fact it will come
back to byte someone. ;-)


sizeof() sounds very C-ish and, 100% of the time, is 8 bits wide in
application. The only thing close to challenging this is a quote from a
$400 book that probably doesn't have pictures, that says a byte is at
least 8 bits.

Find anywhere in C code today that redefines sizeof(byte) as other than 8
bits, which would suggest the byte isn't commonly 8 bits. I would design a
nuke u lar system controller thing and rest assured a byte is 8 bits and
not worry at all.

Sizeof(char) or ulong or ushort? They are all relative.
IBM? Yes. The official 'C' Standard. Yes. Life? Yes. Get over it,
you're wrong.

What does IBM (a small typewriter company) have to do with the official C
standard?

I've proven your definition wrong by example. There are many
"microprocessors" that are *not* microprogrammed. There are many
microprogrammed processors that are *not* microprocessors. An IBM 360 is
hardly a microprocessor, though the "same architecture" some forty
years later, known as the z-Series are, in fact, microprocessors. Some
models are microprogrammed, some are hard-wired. The terms are orthogonal.

Yet you've not shown one authoritative example of a VAX-11/780 being
called a "microprocessor".


The dispute was that some thought my calling it 11/780 cpu somehow implied
the entire system. Have you ever seen a real working 11/780? Ever taken
out the cpu board set?

It's microprogrammed, IIRC
They're afraid you couldn't learn the process.

That should be a challenge for an MBA type, to apply to McD as someone
needing a job, get hired, do well, make improvements, get some award etc and
not tell them about the MBA thing, do it for the exercise.


!
You can continue life being ignorant or you can learn. Your choice.

I'm not from Missouri but you just have to show me. I think when we examine
the details of a web link we do not actually find a change in byte size
definition. I like to learn but need fact checking, not assuming something.

Somebody said that if "I" claimed the sky was blue they would double check
it? Good for you! That's how it should be, not kissing somebody's back end
and assuming they can make decisions for you; learn to do the work yourself.

I know what an 026 is, and an 029. I used them in college, and a couple
of times since. 2629 to me is the model number of my laptop (ThinkPad
A21p).


My guess is that you're not talking about a ThinkPad.

No, the original punched card equipment

http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=7606&scopelist=

I guess ISO 2629:1973 isn't used much anymore. It was one of those tactics
that ibm (a small typewriter company) used to force equipment makers to
adopt one standard so they could get competitors to copy it; then ibm would
change and make the best equipment with a different standard, forcing
smaller companies out of business.



To clean up your messes, no doubt.

No, I mean people tend to be lazy and pawn off work onto others or spend the
time web surfing. If you can imagine, people get annoyed with me when I'm
persistent and in their business asking for results at work. I drive people
to get results, irritating them like nails across the blackboard, which I
find soothing to listen to. In past jobs co-workers complained about me;
the managers wanted to promote me. Doesn't make sense.
 

Thaas

Jan 1, 1970
0
The VAX had POLY: evaluate a polynomial. Most of the transcendental
math functions were microcoded, too.

John
In 1981 Modcomp's president went to a California design firm for the
design of their fully 32-bit machine, the 32/85. I remember the
section of firmware for the transcendental instructions that were
added to that machine were entitled "Transcendental Meditations" by
the programmer.

He was in a hurry for the project to end because he was scheduled to
attempt Mount Everest that year. Name of Igor Mamadlian IIRC.

Earlier machines implemented the Floating Point instructions on a
separate microcoded processor board. The I/O processors were also
separate processor boards. The Space Shuttle launch control system
uses Modcomp computers with extra customized CPU boards that are used
to communicate with other computers in the system through the 64KB
Common Data Buffer.

I designed the CPU of Modcomp's 9250, the gate array version of the
32/85 design. Reduced six 14.5" by 19" boards to one. The
Instruction Stream Processor was one 55K-gate standard cell. The
Memory Management Control used one gate array. The Data Cache
Controller was implemented in two identical gate arrays.

The WCS was implemented in SRAM SIPs as was the register block
storage, which also held the constants for those transcendentals among
other things. All external to the ISP ASIC.

We never called any Modcomp minicomputer or CPU board or even the ISPs
of our gate array designs microprocessors. Gee, I guess I should
update my resume. According to Bradley I've been designing the damn
things for years! Shucks folks! I've designed microprocessors in
wire-wrap!
 

keith

Jan 1, 1970
0
Let me ask you a simple question, do you know what BiCMOS is? (not a trick
question)

Sure. We do a lot of it. What's BiCMOS got to do with the definition
of "byte" or "microprocessor"?

You have never shown any evidence of this being the current standard,
yet you continue to claim the byte definition is arbitrary, and want to
hurry to change the subject now when there is nothing to support what
you thought was an arbitrary byte size

You simply can't read. I gave you a link to a site that quotes the 'C'
standard clearly defining a byte as not necessarily being eight bits. Why
would the "sizeof(byte)" function be in 'C' if the answer were a constant?

Tell you what... go over to one of the "architecture" groups, like
comp.arch or comp.lang.c, and propose that a byte is defined as eight
bits, or ask why sizeof(byte) is needed since the answer is *always* 8.
Be warned; wear your nomex panties.
right, 4 bit byte would double it not 16. but decimal? not a problem

Why, they're lying, aren't they? Clearly a gigabyte is 2^30 bytes, no?

The CDC Cyber series, but is there any documentation on those old
Seymour designs? How can we verify the internal cpu stuff anymore? They
got much of the architecture from Univac, some would argue

Ask on alt.folklore.computers. The users, developers, and architects
hang out there. That doesn't change the fact that sizeof(byte) is not a
constant.

I'm here responding and taking smack from people and discussing the
point, hardly trollish. Newbie to the forum, yes.

Well, given the fact that you picked up on two wrong assertions in the
same thread and fight like a Tasmanian devil to keep your ignorance, the
thought had crossed my mind. I've more than once looked down my throat
for a hook.
Yeah, a lot of people say that.

I understand completely.

I was trying to get work at a nuke u lar facility doing control systems.
seriously

OMG! And to think I'm a nuke proponent. Well, it's clear I can't
correct your ignorance, but you've certainly done the trick for me!
sizeof() sounds very C-ish and, 100% of the time, is 8 bits wide in
application.

Wrong. Why would it be there if it was a constant? Hint: It's there
*because* the programmer cannot assume it for all ISAs.
The only thing close to challenging this is a quote from a
$400 book that probably doesn't have pictures, that says a byte is at
least 8 bits.

The 'C' standard doesn't say it has to be any size, but it must be big
enough to represent the character set, which in practice makes the lower
limit six.
Find anywhere in C code today that redefines sizeof(byte) as other than
8 bits, which would suggest the byte isn't commonly 8 bits. I would
design a nuke u lar system controller thing and rest assured a byte is 8
bits and not worry at all.

That really scares me! Someone *so* arrogant as to contradict the
standards doing *anything* more critical than making toast.
Sizeof(char) or ulong or ushort? They are all relative.

Why are you changing subjects? We're talking "byte" not "char".
What does IBM (a small typewriter company) have to do with the official
C standard?

Nothing (other than they have representatives on the various
standards committees). I gave two (not one) examples of where you're
wrong.
The dispute was that some thought my calling it 11/780 cpu somehow
implied the entire system. Have you ever seen a real working 11/780?
Ever taken out the cpu board set?

The processor is not a microprocessor. It is made of several components,
and thus *not* a microprocessor. Yes, I was instrumental in purchasing an
11/780 in the mid 80s; I know a *little* about it. Tell you what, trot
on over to alt.folklore.computers and tell the good folks over there that
the VAX-11/780 was a microprocessor. Again, wear your nomex shorts.
It's microprogrammed, IIRC

Irrelevant. So were most of the IBM 360 line (only the /75 was
hard-wired), but they were *not* microprocessors by any accepted meaning.
That should be a challenge for an MBA type, to apply to McD as someone
needing a job, get hired, do well, make improvements, get some award etc
and not tell them about the MBA thing, do it for the exercise.

MBA types have shown that they can learn. What it is they learn may be in
question, but...
!

I'm not from Missouri but you just have to show me. I think when we
examine the details of a web link we do not actually find a change in
byte size definition. I like to learn but need fact checking, not
assuming something.

You've been shown you're wrong, but will not see.
Somebody said that if "I" claimed the sky was blue they would double
check it? Good for you! That's how it should be, not kissing somebody's
back end and assuming they can make decisions for you; learn to do the
work yourself.

If most here said the sky was blue, we'd know from previous experience
that it was likely to be so. If you said it, we'd have to rethink our
experience and check it just to make sure. Is that clearer?
No, the original punched card equipment

http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=7606&scopelist=

I guess ISO 2629:1973 isn't used much anymore. It was one of those
tactics that ibm (a small typewriter company) used to force equipment
makers to adopt one standard so they could get competitors to copy it;
then ibm would change and make the best equipment with a different
standard, forcing smaller companies out of business.

That still doesn't change the definition of a "byte" or "microprocessor"
just to make you happy. If you're so interested in corporate "ethics" and
other "dirty tricks", why are you funding BillG and M$?
No, I mean people tend to be lazy and pawn off work onto others or spend
the time web surfing. If you can imagine, people get annoyed with me
when I'm persistent and in their business asking for results at work. I
drive people to get results, irritating them like nails across the
blackboard, which I find soothing to listen to. In past jobs co-workers
complained about me; the managers wanted to promote me. Doesn't make
sense.

Ah, so you're just a prick that turns in your cow-orkers to management.


BTW, a decent newsreader would be a good "investment". But at least
you're not a top-poster, so there may be hope yet...
 