
Heathkit ET 3400

Discussion in 'General Electronics Discussion' started by JRUBIN, Sep 30, 2017.

  1. JRUBIN
    I picked up a Heathkit ET-3400 for next to nothing this week. It is a bare-bones microprocessor trainer.
    The test programs show everything seems to be functioning, though I have seen that if you crash it hard, the RAM changes value in the zero-page area... not cool.


     
    hevans1944 likes this.
  2. hevans1944 Hop - AC8NS
    Good find! Now go build a nice shadow-box, with a glass cover to keep the dust out, and mount this ancient treasure on a wall in your living room or man cave. Go buy some modern microprocessors and learn how to program and use them. You will learn nothing useful playing around with this ancient beast. You had to have been there to have done that. It's waaay too late now. Embrace the change. There have been some wunnerful improvements since your trainer was new.
     
    davenn likes this.
  3. JRUBIN
    I learned on the 6502. I wrote software for the Commodore 64 and the Apple II. I actually still have a few projects going at this time, mostly dealing with ACIAs. So it wasn't really a stretch to pick something like this up as a novelty.
     
    hevans1944 likes this.
  4. hevans1944 Hop - AC8NS
    As I said: Good Find!

    Some of my fellow technicians in the 1970s bought KIM-1 6502-based processors to learn with. I had better financing and support from management, so I was able to purchase an Intel Microprocessor Development System to develop 8080 software. Later, after I graduated in 1978 with a BEE degree, I went to work for a small company that wanted me to develop a field-portable data acquisition system based on the 8085. Turns out they had an Intel MDS too, but if not, I would have ordered one.

    I took over a partially finished upgrade of an existing system already in the field that used an audio tape recorder to record 110 baud acoustic modem tones, the so-called Kansas City Standard. They would spend a day recording digital data in the field on audio tapes using a modem, then play the tape back through an acoustical coupler for transmission to the "mother ship" in Beavercreek, OH, for analysis. This was all part of a "ground truth" program used to calibrate highly classified overhead imaging systems.

    I worked on the ground truth part of it for several years before gaining the proper "tickets" to be "read in" to the overall program. Fascinating stuff, but once you learn about it, you live in fear of accidentally "spilling the beans" in idle conversation and going to prison for it. Well, at least I did. I didn't like knowing, and having to protect, those kinds of secrets. But the engineer who was designing the upgrade to their ground truth data acquisition system was faced with a dilemma: he could either divest his business interest in a small electronics company that did classified contract work for the Government, or he could leave the employ of the company I had just signed on to work for.

    I didn't learn about his hard choice until after I was employed, whereupon he told me that I was replacing him because he would not give up his interest in what was considered to be a competitor. Turns out, he had already selected an 8085 microprocessor system and a digital tape recorder to replace the TTL circuitry and analog audio recorder. Some hardware and software integration was still required, and that task was mine to complete. IIRC, it took about a year to get everything up and working, but I very much enjoyed learning about digital tape recording and how to interface the tape deck to the microprocessor. Once we got it deployed into the field, reliability of data transmissions to the mother ship was vastly improved.

    Even with poor telephone lines, typical of cheap motel rooms located in the middle of nowhere, the transmission protocol allowed for error correction of data frames. Management was happy with it, but I didn't have the heart to tell them it was already obsolete. If management had their way, the data acquisition package would have been based on a Digital Equipment Corporation minicomputer... maybe a PDP-11/03 with a Winchester hard drive, a floppy drive, and an RT-11 real-time operating system. It would have been four times the volume and easily ten times the weight of what was eventually fielded in a custom-made, hermetically sealed, fiberglass backpack carrying about ten pounds of sealed lead-acid batteries for field power.

    This was a DEC shop until the bitter end, but this Intel 8085 project provided a wedge that allowed me to sell other Intel embedded microprocessor projects to management, including embedded IBM PCs, albeit with fierce resistance from the entrenched software weenies who wanted nothing to do with what they called "toy computers". Too bad for them, because my oldest son grew up learning how to write programs for Commodore 64 personal computers before moving on to "big iron" at Ohio State. And by the time he graduated from The Ohio State University with an EE degree, PCs were a done deal and you couldn't graduate without one. DEC went bankrupt and "out of business," but I had left the company that thought DEC would be around forever long before that happened.

    I understand many of my fellow employees migrated to another company that does classified contract work. That makes a lot of sense because acquiring security clearances is an expensive process. Unless you actively work for a company that requires your clearance, it becomes inactive. If enough time passes, and you fail to keep good records of what you did, where you did it, when you did it, and with whom you did it, it is virtually impossible to reinstate your clearance, because it is cheaper to hire a young person with less history to examine. Employers pay the cost of obtaining clearances for their employees, so they aren't exactly handed out like jelly beans.

    I never really had time to learn about any of the other microprocessors that sprang up overnight in the 1970s. Things like the Zilog Z80, the Motorola 6800 and 68000, and dozens of others that enjoyed their fifteen minutes of fame before disappearing from the landscape. It was a heady time to be involved in microprocessors, and most of us had vision that upper-level management didn't have. But even being deeply involved doesn't mean you can see the forest for the trees. I totally missed (and argued against) client-server based networked systems. In the beginning, everything on a client-server network was supposed to be a "dumb" terminal... just a mouse, keyboard, and monitor presented to the user. But PCs advanced so rapidly, and became so powerful and self-sufficient, that the client-server model didn't hold up very well. People wanted to do "other things" at their "terminal."

    Then the Internet and Internet bandwidth exploded, allowing stuff that would previously execute on a PC to now execute on blade servers "in the cloud". Sure, you can afford a few terabytes of local storage now, but how the heck are you gonna back it up? Having storage in "the cloud" moves responsibility for protecting that information from the end-user to the cloud server. So the wheel comes around again, and ideas that had traction in the 1990s have re-gained that traction by simply re-branding them as "cloud computing" instead of client-server networking. And nowhere is a "dumb terminal" to be seen.

    If I were you, I would proudly display that training tool in a shadow box with an extension cord, and maybe breadboard a simple demo "circuit of the month" to show off to visiting firemen. It's nice to see that "ancient technology" has not been totally forgotten, and maybe still has a place in the 21st Century. Thank you for posting here.
     
  5. bushtech
    Thank you hevans1944. Fascinating stuff!
     
  6. JRUBIN
    Here's a video with a demonstration of the trials and tribulations of the unit.

     
    hevans1944 and Samrt like this.
  7. Samrt
    I beg to differ with hevans1944. The Motorola 68XXX (successor to the 6800) is still very alive and well, especially in very high-speed processing applications, including the launch computer used by NASA for all rockets leaving their launch pads. This includes SpaceX rockets.

    The military is also a big user of this processor and I suspect that there is at least one in use at CERN as part of their recent reflective memory hardware upgrade to enhance their beam control system.

    Why am I so sure? The launch monitor computer used by NASA to monitor all the mission critical sensors on ANY rocket on EVERY launch pad is a VME rack. Every VME computer rack will always have a "bus master" computer (usually in slot 1 by default at power up). This bus master computer controls the whole shooting match, as it is in charge of all data on the bus -- it "owns" the bus.

    This bus master computer is almost always a Motorola 68XXX processor for several reasons, not the least of which is that the whole VME architecture was designed around that chip.

    The fact is that for rack applications where multiple computers are processing information simultaneously and sharing data with each other on a common backplane bus such as the VME (and actually many other rack systems, such as Allen-Bradley PLCs, Reliance coordinated motion control systems and DCS, etc.), the Motorola family of processors is the best fit for the bus master computer.

    This is not only for the tri-state bus configuration that allows multiple processors to take over bus control (when granted) without conflicts, but also for its ability to "micro-channel" parallel bus operations in a manner such that any of the bits can be configured as either inputs or outputs, and data can be passed not only bi-directionally, but on the very same clock pulse. Whether a particular bit presented on the bus is an input or an output depends on how the "Data Direction Register" is loaded at the time a bus transfer occurs. This was unique to Motorola at the time, because all other processors required that all the bits on a parallel bus be either inputs or outputs at the same time during the same clock pulse.
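    The Data Direction Register behavior described above can be sketched in software. The following is a minimal, hypothetical C model of a 6800-family PIA port (the register addresses and control-register protocol of a real MC6821 are omitted); it only illustrates how one read can mix output-latch bits and external pin levels according to the DDR.

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <assert.h>

    /* Hypothetical software model of one port of a 6800-family PIA.
       Each DDR bit sets that pin's direction: 1 = output, 0 = input. */
    typedef struct {
        uint8_t ddr;       /* 1 bits are outputs, 0 bits are inputs */
        uint8_t out_latch; /* value the CPU last wrote to the port  */
        uint8_t pins_in;   /* levels driven externally on the pins  */
    } pia_port;

    /* Reading the port returns output-latch bits where DDR = 1 and
       external pin levels where DDR = 0, in a single access. */
    uint8_t pia_read(const pia_port *p) {
        return (p->out_latch & p->ddr) | (p->pins_in & ~p->ddr);
    }

    int main(void) {
        /* Upper nibble configured as outputs, lower nibble as inputs. */
        pia_port p = { .ddr = 0xF0, .out_latch = 0xA5, .pins_in = 0x3C };
        uint8_t v = pia_read(&p);
        printf("port reads 0x%02X\n", v);  /* 0xAC: 0xA0 out | 0x0C in */
        assert(v == 0xAC);
        return 0;
    }
    ```

    The same access returns driven bits and sampled bits together, which is the per-bit mixing being claimed for the Motorola parts.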

    This "micro-channeling" scheme was so unique to the 68XX and 68XXX that it is one of the biggest reasons Motorola jumped ahead of all the other processors (Intel, Zilog, etc.) as the processor of choice when it came to industrial control hardware (PLCs and DCS), NASA launch computers, and military applications.
     
  8. Samrt
    Article on use of VMEbus in DoD applications (fighter jets, missiles, etc.)
    https://issuu.com/rtcgroup/docs/cots1201/64

    NASA grounding standards for VME racks.
    https://www.google.com/url?sa=t&sou...FjAAegQIAhAB&usg=AOvVaw0wyrK6QAQsOlwOBv9Cycy7

    Also, SpaceX had to abort a launch from a NASA pad last summer. The abort condition was originally reported by NASA as simply "VME fault" because the alarm condition was not "mapped" all the way through to the screen display computer. That fault turned out to be a low-pressure reading from a nitrogen sensor, if I recall correctly, but the system is designed in a manner where, when a sensor input is out of range, it generates an interrupt request on one of the seven IRQ lines on the backplane of the VME rack.

    Finally, this article explains the history of the development of the VMEbus rack system used heavily by NASA and DoD as shown in the info above, as well as how tightly the design of the VMEbus architecture is tied to the Motorola 68000 chip design. It explains clearly why the 68000 is the chip of choice for the bus arbitration scheme used on the VMEbus. I have purchased many "bus master" computers for VME buses in my lifetime and I assure you that 99.99% of them use at least one MC68XXX processor, just in case there is any doubt.
    https://en.wikipedia.org/wiki/VMEbus
     

    Last edited by a moderator: Jul 2, 2019
  9. AnalogKid
    So do I, with both of you.
    No, it isn't, for several reasons, not the least of which is that the majority of VME SBCs are Intel-based.

    Yes, VME grew out of the Motorola EXORciser development system of the late '70s. But while it was away at finishing school in Europe, the control structure became less 6800-specific, and less 8-bit specific. And once IBM anointed the Intel x86 over the 68xx as their processor of choice, everything changed.

    Actually, that is opinion, not fact. And I think you are minimizing the roles the software development environment and operating system play in overall system performance and reliability.

    In VME and VXI, the only thing that must be on the slot one card is the bus arbiter, because of the daisy-chain nature of the arbitration logic. This circuit handles interrupt requests and bus arbitration (deciding which board is controlling bus transactions), and is completely independent of the local CPU.
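    The daisy-chained arbitration mentioned above can be sketched abstractly. This is a simplified, hypothetical model (real VME has four bus-request levels and separate grant daisy-chains; here there is a single chain): the arbiter drives the grant from slot 1 outward, and each board either claims it or passes it along.

    ```c
    #include <stdio.h>
    #include <assert.h>
    #include <stdbool.h>

    #define SLOTS 5

    /* Simplified model of a daisy-chained bus grant: the grant
       propagates from slot 1 down the chain, and the first board
       that is requesting the bus claims it; the rest never see it. */
    int arbitrate(const bool request[SLOTS]) {
        for (int slot = 0; slot < SLOTS; slot++) {
            if (request[slot])
                return slot;  /* first requesting board claims grant */
        }
        return -1;            /* no requester: grant goes unused */
    }

    int main(void) {
        bool req[SLOTS] = { false, false, true, true, false };
        int winner = arbitrate(req);
        printf("grant claimed by slot %d\n", winner);  /* slot 2 */
        assert(winner == 2);
        return 0;
    }
    ```

    The point of the daisy-chain is visible even in this toy version: priority falls out of physical slot position, which is why only the arbiter itself is pinned to slot 1.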

    I've done VME and VXI systems for industrial and MIL customers (including a 20-slot, 9U, 400 mm beast with four PowerPC CPUs per board, 16 boards, 1000 A at 5 V). Like you, I've designed/modified/upgraded a lot of VME systems, but my experience is the exact opposite. From what I've seen, the industry uses Intel SBCs more than Motorola, even though there is no question but that the Motorola processors are technically superior.

    It's been a while since I've paid any attention to market data. Do you have recent market share numbers for 6U / 160 mm SBCs?

    ak
     
    Samrt likes this.
  10. hevans1944 Hop - AC8NS

    So how do either of you "beg to differ" with what I posted? This thread was initially about the discovery of an obsolete trainer in very good to excellent condition. It was used in the previous century during the decade of the 1970s to learn about the obsolete Motorola 6800 8-bit microprocessor. Heady times those were!

    There were two camps. In one camp were those who thought that a "computer on a chip" was a mere novelty and would never compete with "real" computers. In the other camp there were (mostly young) engineers and technicians who saw the potential and dived in to learn as much as possible about how they worked, how to use them, and how to program them. These early adopters were not white-coated priests, hidden behind glass-walled air-conditioned rooms, seemingly holding everyone hostage to the gods of Computer Science if they dared to mention they wanted to use a computer. Oh, no! These early adopters were practical, hands-on, make-it-do-something types of people. The closest most of them came to programming was possibly the use of an interpreted BASIC running on a time-shared university mainframe. The trainer came with a whopping 256 bytes of RAM and 1024 bytes of ROM, used to operate a primitive keyboard and character display. Programs were entered manually, one byte at a time, as machine code.
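    To give a flavor of what "entered manually, one byte at a time, as machine code" meant on a trainer like this, here is a toy C interpreter for just three real 6800 opcodes (LDAA immediate, ADDA immediate, STAA direct) plus SWI used as a stop. The addresses and the program are invented for illustration; a real ET-3400 executed these bytes on actual 6800 silicon via its monitor ROM.

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <assert.h>

    uint8_t mem[256];  /* the trainer shipped with 256 bytes of RAM */

    /* Interpret a handful of 6800 opcodes starting at pc. */
    void run(uint8_t pc) {
        uint8_t a = 0;                   /* accumulator A */
        for (;;) {
            uint8_t op = mem[pc++];
            switch (op) {
            case 0x86: a = mem[pc++];        break;  /* LDAA #imm   */
            case 0x8B: a += mem[pc++];       break;  /* ADDA #imm   */
            case 0x97: mem[mem[pc++]] = a;   break;  /* STAA direct */
            case 0x3F: return;                       /* SWI: stop   */
            default:   return;                       /* unsupported */
            }
        }
    }

    int main(void) {
        /* Bytes "keyed in" at address 0x00: load 2, add 3, store at 0x20. */
        uint8_t prog[] = { 0x86, 0x02, 0x8B, 0x03, 0x97, 0x20, 0x3F };
        for (unsigned i = 0; i < sizeof prog; i++) mem[i] = prog[i];
        run(0x00);
        printf("mem[0x20] = %d\n", mem[0x20]);  /* 5 */
        assert(mem[0x20] == 5);
        return 0;
    }
    ```

    Seven keystrokes' worth of hex to add two numbers: that was the whole programming experience, and it taught the instruction set in a way no assembler ever could.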

    The Motorola 68XXX (successor to the 6800) was not even a subject of the thread discussion, nor was this processor available when the training kit was offered to the general public. I suggested to @JRUBIN that the trainer be showcased as one example of where microprocessor technology began in the 1970s. So maybe NASA has moved on since then... who gives a fig? Both of you should visit Jordan Rubin's websites to see who you are dealing with. I suspect he and I have gone down similar paths, and although I never (as an Air Force brat) had the opportunity to collect as much "stuff" as Jordan has accumulated, we do appear to share similar interests.

    So go ahead and hijack this thread and pretend it's all about Motorola 68000-series processors and VME data acquisition and control systems, but you better keep an eye peeled for what National Instruments, among others, is doing. SpaceX and NASA are both jokes IMHO. Where is the USA Moon Colony with regularly scheduled flights between the Earth and the Moon? Is there nothing valuable on the Moon worth the effort to develop and mine it? Where is the USA Mars Mission? Where is a published national space policy, outlining objectives and a clear direction forward? Does anyone think another President is gonna make a bold statement about explorers landing on and returning from Mars, or any other planetary body in our solar system, by the end of the 22nd Century? Perhaps we are all waiting around for Alien Technology to get things done for us.

    I think what Jordan Rubin found and posted images about was simply wonderful! It sure brought back some fond memories.
     
Electronics Point