Parallel Processing Progress

Discussion in 'Electronic Design' started by [email protected], May 2, 2006.

  1. Guest

    Why has parallel processing started to take off in the last few years?
    I remember all of the hoopla around 15 or 20 years ago about what an
    impact it was going to have. Then interest subsided somewhat as there
    were claims of difficulty with software and programming.

    Now it has made a comeback. I have read of near-miraculous applications,
    and it should deliver great advances with AI innovations. So I ask again:
    why is it finally starting to work out?

    Joel
     
  2. Guest

    * Cheap, mass-produced CPUs.
    * FPGAs are cheap and a workable solution with a standard language.
    * There are physical limits on serial data processing
    (I expect a 20 GHz CPU to see the light of day within 15 years, though).
    * Improvements in programming methods?

    The cost ratio of one fast CPU vs. many cheap CPUs also matters.
     
  3. Not to mention improved techniques to interface them.

    I read somewhere that the huge problem with parallel processing is not
    the hardware but the software (the programming). But I also read
    recently that someone (maybe AMD) figured out a way to get around this.
    That is, instead of leaving it up to the programmers to handle the
    parallelization, one could code for a single processor and it would run
    efficiently on any number of processors without any code change.
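
    For contrast, here is the bookkeeping such a scheme would hide: a
    hand-parallelized loop in plain Python (a sketch only; the worker
    function, data, and chunk size are all invented for illustration):

        from concurrent.futures import ProcessPoolExecutor

        def expensive(item):
            # Stand-in for per-item work that shares no state with
            # other items, which is what makes it safe to parallelize.
            return item * item

        if __name__ == "__main__":
            data = range(100_000)
            # Serial version: results = [expensive(x) for x in data]
            # The parallel version makes the worker pool and the work
            # splitting explicit, exactly the code an auto-parallelizing
            # toolchain would have to generate for you.
            with ProcessPoolExecutor() as pool:
                results = list(pool.map(expensive, data, chunksize=1_000))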
     
  4. Phil Hobbs

    Yeah, parallel folk always say that. It's like the 100-mpg carburetor
    that keeps turning up, even though nothing bigger than a lawnmower has
    used a carburetor in a looong time.

    Efficient parallel algorithms are hard to make. In a given program, you
    may have lots of things that parallelize well, e.g. some types of loops,
    and still be unable to use all that horsepower. Amdahl's Law is highly
    relevant: if x% of your job has to be done serially, and 100-x% can be
    done in parallel, then no amount of parallelism can speed the job up by
    more than a factor of 100/x. Building parallel algorithms is partly an
    exercise in reducing x.
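
    To put numbers on that (a quick sketch in plain Python; the formula is
    the standard textbook form of Amdahl's Law, not anything specific to
    this thread): with serial fraction s = x/100 and N processors, the best
    possible speedup is 1 / (s + (1 - s)/N).

        def amdahl_speedup(serial_fraction, n_processors):
            # Upper bound on speedup when serial_fraction of the
            # work cannot be parallelized (Amdahl's Law).
            return 1.0 / (serial_fraction
                          + (1.0 - serial_fraction) / n_processors)

        # Even 10% serial work caps you near 10x, whatever the core count:
        print(amdahl_speedup(0.10, 100))     # ~9.17
        print(amdahl_speedup(0.10, 10_000))  # ~9.99, creeping toward 10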

    None of this is to say that multicore CPUs are bad--if your OS is
    running 100 threads, on a 100-core machine they can each have a whole
    core. Thus, for example, IE can freeze up 3 cores rendering eye candy,
    and you can still play Pong on the other 97 simultaneously.

    Cheers,

    Phil Hobbs
     
  5. In other words, tools.

    Back in the early days, the tools (and platforms) were expensive; not
    many people could afford to play around with them and advance the field.
    As the h/w got cheaper, more people got access to it and were able to do
    R&D, build the required tools and optimize future hardware development.

    The typical CS major in college today won't have too much trouble
    getting some time on the department's Beowulf cluster (if they don't
    lash one together in their dorm room).
     
  6. Mochuelo

    IMO, they have almost reached the performance limits of the current
    technology (the cow "reduce size, decrease voltage and increase clock
    frequency" won't give much more milk), and now they are _forced_ to
    come up with genuinely smarter ideas to keep progressing.

    Best,
     
  7. I well remember that hoopla and nothing coming of it. I also remember
    acquiring, around that time, some small wisdoms that have held good since:
    1] All announcements from the AI crowd are worthless and to be ignored.
    2] All programmer talk about parallel processing is to be ignored.
    3] All mentions by marketing people of 'near miracles' are to be ignored.
    4] General-purpose parallel processing is nice to think about but is
    actually impossible.

    That just leaves what little parallel processing there is, in the realm
    of massive number crunching (i.e. weather forecasts, bomb detonations,
    etc.), where each processor can work happily as one row/column element
    of a mathematical matrix.
    Nowadays it's much cheaper and simpler, but nothing new!
    john
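
    A minimal sketch of that row-per-processor decomposition, with Python's
    multiprocessing standing in for the number-crunching cluster (the
    smoothing step and grid size are invented purely for illustration):

        from multiprocessing import Pool

        def relax_row(row):
            # One matrix element's worth of work: smooth a single row.
            # Rows don't depend on each other here, so each one can go
            # to its own processor with no coordination.
            n = len(row)
            return [row[i] if i in (0, n - 1)
                    else (row[i - 1] + row[i] + row[i + 1]) / 3.0
                    for i in range(n)]

        if __name__ == "__main__":
            grid = [[float(r + c) for c in range(512)] for r in range(512)]
            with Pool() as pool:                  # one worker per core
                grid = pool.map(relax_row, grid)  # rows done in parallel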
     
  8. Tim Williams

    ....
    5] String theory :p (So I've been told..)

    Tim
     
  9. Ken Smith

    Are we sure it really has? The technical press can have the "slow news
    day" effect like the popular press does.

    The machine I'm typing this on is "multiple cored" but I wouldn't call it
    a real "parallel processor". The two CPUs are doing very different
    things.

    Windows NT is supposed to run on multiple processors. Again, this isn't
    really parallel processing. One processor is used to make that stupid
    "Clippy" blink at you while another is graphically rendering the BSOD.

    Many problems are very hard to spread across multiple CPUs. Others, like
    optics "ray tracing" and predicting air flow over a wing, are a natural
    fit for it. Parallel processors have been in steady use for that sort of
    job since the 1970s at least.

    I've read that we would all have flying atomic-powered cars by the year
    2000. I look around and see that it didn't come true.

    Prediction is hard, very hard if it is about the future.

    That said, the main places I see it making advances are where it can
    extend from areas where it already has a foothold. I expect that we will
    see it starting to show up in the high end of video games, CAD
    workstations and the like.

    Parallel processors will just allow a bigger and bigger hammer to be
    used for AI. It won't suddenly cause all the programmers to invent
    screwdrivers and wrenches.

    Again, maybe it really hasn't taken off. Maybe some reporters got bored.
     
  10. Ken Smith

    How about everyone learning APL? There's a language that could use
    arrays of ALUs.
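
    NumPy isn't APL, but it inherits the same whole-array style, which is
    the point here: one expression over a million elements, with no loop
    order for the hardware to respect (a rough sketch, nothing APL-specific
    about the operations chosen):

        import numpy as np

        a = np.arange(1_000_000, dtype=np.float64)
        # Each element-wise operation below is a million independent
        # lane operations, exactly what an array of ALUs could chew
        # through at once.
        b = np.sqrt(a) + np.sin(a)
        total = b.sum()   # and a reduction, the other array primitive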
     
  11. We thought that Moore's law would come to an end earlier. Maybe
    now we're getting close though, as some have commented.

    One of the main drivers, though, is the increasing trend toward server
    virtualization, and the needs of specific-purpose applications,
    especially database engines, that are economically important
    enough to warrant the (huge) extra cost of software development.

    I don't think that mainstream software development is even getting
    close to writing parallel applications, however.
     