
8051 on-chip debugging

Discussion in 'Electronic Design' started by Schueler, Mar 25, 2011.

  1. Arlet Ottens

    Arlet Ottens Guest

    Some time ago, I compared GCC with ARM ADS 1.2, and found that the ARM
    compiler generated about 10% smaller code, which was significant enough
    to use it (I had a 4KB code limit for that project). This was using
    Thumb mode for ARM7.

    In the meantime, both the ARM and GCC compilers have improved, and I
    haven't done any further comparisons.

    As for the other properties, I preferred the GCC compiler, due to its
    much better extensions, better assembler syntax, and more powerful
    linker scripts.

    Having to deal with the license server was also a royal pain when
    traveling with a laptop, and trying to get some work done without a good
    network connection.
     
  2. Chris H

    Chris H Guest

    Many commercial compilers do not permit the publishing of any
    benchmarks in their licences. I think this is particularly true of the
    American ones, as the US has a culture of advertising where direct (and
    usually negative) comparisons to competitors are permitted. In Europe
    that is not common, and in many places prohibited. This may explain how
    the situation came about.

    However... most commercial compiler companies usually manage to get
    hold of all their competitors compilers and run various benchmarks. The
    results are obviously confidential.

    From what I have seen, apart from the usual Dhrystone and Whetstone,
    most run a lot of "real world" benchmarks such as compiling large
    applications.

    The standard benchmarks are of little use unless you are building
    benchmarks. BTW, from personal experience, compiler companies do not
    specifically optimise compilers to do well on the Dhrystone/Whetstone
    type benchmarks. There is little point.

    Many will have applications from some lead clients, with permission,
    that they have done tech support for. So there is a lot of real world
    application bench marking going on. It just does not get out into the
    public domain.

    They tend to do "real world" benchmarks because anything else is
    artificial and of little real use. The benchmarking is for internal
    use, so the compiler development teams can see how their compiler
    stands up in reality against others. No one has customers writing
    Dhrystone applications :)
     
  3. As ChrisH says, it's in the license conditions of the commercial
    compilers. The big players anyway.
    We used to have a guy on here who was extremely knowledgeable about the
    ARM compiler libraries and the CM3; I think he wrote a lot of them. In
    any case he regularly trashed the gcc maths library implementations,
    and I do tend to believe him, much as I like gcc.
    That's exactly what I do too but it seems like an uphill task in this
    case. There's an awful lot of cruft in there. Still not decided which
    way to go. I think I will likely use the core_cm3 files but ignore all
    the stm ones.

    What I usually find with these things is that there is a UART library
    say with a few dozen functions to set/reset/query every individual
    configuration bit, status flag and operating mode. And it still turns
    out to be just a lame polling driver rather than interrupt driven.

    Whereas in my own code it ends up being a couple of config register
    writes and off you go. I don't *care* about IrDA mode or whatever. Yes,
    the compiler will remove a lot of the unused stuff, but it's still
    there to confuse me when I go bug hunting.

    It's very nice to have it all available as a reference though.
     
  4. Chris H

    Chris H Guest

    Not that common... However there are several main players who do
    prohibit the publishing of benchmarks. So there is little point in the
    others publishing their own. Benchmarks are only of any use when
    comparing things.
    Also, anyone who comes out at the bottom, or not at the top, will
    scream blue murder that the figures and/or testing are fixed.
    However, all the benchmarks I have seen (these are internal, as they
    usually include compilers whose benchmarks you can't publish) normally
    use not just the standard benchmarks but also a number of large
    reference programs, often from customers.
    This is very true. I always tell people to do this.
    Never going to happen.
    Well, I have several sets of tests from 12 and 6 months ago. None of
    which I can publish or name, of course, as they are (a) confidential
    and (b) contain benchmarks for compilers you can't benchmark....

    It is interesting to see trends from one set to another.
    Not according to the information I have.
    This is true. The smaller the MCU the more specific the compiler has to
    be.
    Yes it is a generic system not really suited to embedded systems
    That is not the reason. C99 was not wanted or needed by the embedded
    community. Hence no mainstream embedded compiler supports C99, 12 years
    on. BTW, having a C99 switch does not mean it supports C99!

    As mentioned elsewhere the ISO C committee are moving to make some C99
    features "optional"
    It is not the quantity.
     
  5. Chris H

    Chris H Guest

    Not all of them.... tends to be a N. American thing.
    I thought there were various different sets of libraries for gcc?
     
  6. Well I am not at all an expert but yes, in principle. Here is my
    understanding:

    1) There are things that appear as "inline" code, like basic integer
    arithmetic and conversions between one integer type and another. These
    account for the vast majority of many projects' embedded code and are
    what I mean when I say there is "little difference in compiler output".

    2) Then there are functions that cannot easily be expressed inline,
    like floating point versions of basic arithmetic operations (on an
    integer processor). I think these are in libgcc, part of the FSF gcc
    distribution. I believe there are a couple of implementations that in
    principle you can choose from when building gcc yourself, but the
    distribution will normally set this for you automatically according to
    the selected target.

    3) Then there are things like trig functions; these require explicit
    linking with a math library, for example the libm component of newlib.

    I think some of the commercial gcc vendors provide their own
    implementations of 3). Don't know about 2).
     
    The large 16- and 32-bit systems of yesteryear are the microcontrollers
    of today. C's assumption of 16- and 32-bit ints, and of efficient
    access via pointers and the stack, crippled 8-bit CPUs, but it works
    extremely well on an ARM, say. Furthermore, new parts are usually
    explicitly designed with C in mind. So C actually becomes a *better*
    fit as time goes by.

    And so, by extension, does gcc, even though it too was created for
    "large" (non-embedded) systems.

    [...]
     
  8. Nico Coesel

    Nico Coesel Guest

    There are. I always use the C library which comes with MSPGCC (MSP430)
    for ARM projects. It is very small. If space constraints are really
    tight, I switch to a small implementation of printf/sprintf.
     
  9. Chris H

    Chris H Guest

    Trouble is, most of the world doesn't work that way.... :-( Also it is
    a moving target; compilers evolve all the time.

    Besides, many choose (initial) purchase price over anything else anyway.
    No one trusts anyone to do that. They are also only going to show off
    favourable benchmarks.

    Because the benchmarks I get to see on occasion are for internal use by
    the engineers, they are very fair and impartial, down to noting exactly
    which versions and settings were used for each compiler.

    The second you publish those, someone will complain that the settings
    should be changed on their compiler for a particular test, etc. (GCC
    people are some of the worst for this, so no one bothers.)
    Sorry, I have to ask marketing/legal/corporate before I answer that....
    :)
    Well, Keil tended to package most of the standard benchmarks
    (Dhrystone, Whetstone, sieve) with their compiler, and they all fitted
    into the eval version. However, the meaningful "real world" benchmarks
    are large amounts of real code and not something you can give away.
    OK... So you are going to give up a few weeks of your life to do it?
    Free of charge? Doing this is not a simple task.
    Dream on.... may be one day.... Perhaps a job for you when you have
    retired? If I am still sane I will give you a hand.
    Absolutely. This is why there is no test suite worthy of the name for
    GCC. It takes an enormous amount of disciplined and diligent work.
    Agreed. I know what I have seen, but the minute I name names (or
    numbers) it is the last time I get any information on anything.
    That last comment is disingenuous.
    Absolutely. Always have done. There is no "best" compiler in a practical
    sense. A lot depends where you are coming from and where you are going
    to.
    Agreed. Support is another. Also what other tools it will work with and
    what functionality you need. For example, it looks like the FDA (US
    medical authority), bless them, have caught up 30 years and are now
    likely to want not only static analysis but MC/DC code coverage on the
    target.

    So the tools you will need for that are very different from those for a
    £4.99 mass-produced air freshener.
    I would disagree with that. The technology it uses is very old and not
    really suited to many MCUs.

    They are getting there.
    I don't keep much of an eye on C++ at the moment (too busy)
    Tell me about it... Causes us a lot of fun in MISRA with "effectively
    bool"
    The problem with // comments is not the // but how the other end of the
    line was handled. However, many if not most C90 compilers had them
    before the standard did.
    Yes.

    BTW, I was arguing for the last decade that the ISO C panel should only
    add in and regularise things compilers were actually adding, on the
    grounds that if a feature really was required then compiler makers
    (commercial or otherwise) would add it, as there was a real market.
    Hmmmm, not sure there. I know a lot of people doing embedded systems
    with screens and multiple non-Latin alphabets.... Chinese, Arabic,
    Cyrillic etc., all on the same device, programming with non-European
    keyboards.
    Well, I am never a fan of dynamic memory allocation, variable numbers
    of function parameters, or variable-length arrays in any embedded
    system.
    Better? No. There are some good ones doing Open Source, but then again
    there are some appalling ones. Everyone plays. The commercial (closed
    shop) development teams can keep to a standard and have a much more
    controlled process.
    That is true. However, it is like playing bridge where GCC is the dummy
    hand: it is open, so the commercial people can see what is happening in
    the GCC world, but not the other way around.

    I would say most do. The reason is simply the practicality of running a
    large project. The commercial teams have a very rigorous system that
    must be used and they are in full control of it. This in itself makes a
    big difference.
     
  10. Chris H

    Chris H Guest

    They run the same tests on all of them. Then they look at the results.
    The benchmarks are fair and impartial; it is pointless doing them
    otherwise.
    That is why they run the benchmarks.

    Really? Then you are talking to different development teams than me.
    There aren't any... that is where we came in. A lot of compilers do not
    permit the publishing of benchmarks.
    Then ALL benchmarks are pointless.
    The Engineers don't release them. Why would they?

    Me neither but it would be good to have a go.
    OK. I'll buy.
    Neither do I in this discussion.
    The problem is the commercial compilers are not going to give any hints
    to the GCC people....
    It depends which targets you are talking about.
    :) It is guidance not a religion.
    Maybe. What would you use in its place?
    Yes.... If a tool vendor cannot implement a "cool" idea there is no
    point in having it.
    Ask the Chinese, Russians etc
    It depends where on the planet you are sitting.

    Apart from APL I thought all computer languages were written in ASCII.
    I wonder what will happen when the Chinese invent a computer
    language.....
    That is true. :)
     
    I have used the "generic" approach; it all sounds great in theory, but
    I am not sure it added much for me in the end. One thing that has
    worked well so far is a set of generic "port I/O" macros that abstracts
    single-pin I/O operations, so I can write a bit-banged chip driver once
    for multiple MCU families. And I have various hardware-independent
    libraries for CRCs, bit manipulation, graphics, fonts and so forth that
    are invaluable. But "hardware-independent hardware drivers" - not so
    much. It is all so interrelated, and on a modern microcontroller there
    are so many options, 95% of which I will never use. It is not worth
    spending time writing functions for every possible feature "just in case".
     
  12. Nico Coesel

    Nico Coesel Guest

    I'm under the same impression. A commercial tool always has a limited
    budget, so the quality of the solution (in terms of bug count and
    innovation) is limited to what is economically viable with the
    workforce at hand. Open source has far fewer problems with these
    limits: if a project is successful, better programmers will step in;
    if not, development will halt. Kind of like evolution.

    OpenOCD is a nice example. I needed faster loading for MIPS platforms,
    so I optimized the MIPS support to be approx 30% faster. Someone else
    came along and optimized it even further.
     
  13. Nico Coesel

    Nico Coesel Guest

    I'd rather do that as I go along. Sometimes you're lucky. I like the
    way the header files are organized for NXP's LPC2000 series, except for
    some inconsistent naming conventions between different family members.
    However, for the LPC1000 (Cortex) series they used structs, which I
    don't like (too much can go wrong), and it killed existing code. I
    ended up converting most of the LPC2000 headers to LPC1000.
    The CMSIS library is not bad as it is. It's one of the very few times I
    actually kept code which came from NXP's website. Keil wrote a lot of
    generic drivers for the LPC2000 series, but it's all a bunch of
    incomplete, overcomplicated and flaky crap.
     
  14. Chris H

    Chris H Guest

    That I don't believe. The Keil system supports virtually all the 8051s
    out there. Its performance as a compiler I have not seen beaten
    anywhere. Its simulator is generally accepted as the best in the
    business.
    Yes, it costs less.
    What limitations?
     
  15. Chris H

    Chris H Guest

    From being involved in independent verification of compilers. Yes.
    We were discussing compilers.
    Then your definitions are wrong :)
     
    You're suggesting, in the background, that a commercial project is
    better at rigorous control. A dogma.

    This is simply not true by definition. Have you ever looked into
    Debian? That may be the largest project in the world barring
    military ones. I've never worked in any commercial environment
    with such tight, effective control as Debian.
    Bureaucratic burdens, yes. Effective quality, no.

    Groetjes Albert
     
  17. Rich Grise

    Rich Grise Guest

    I first got turned on to Linux in the mid-1990's. I was looking around
    at various distros, and finally settled on Slackware, because I liked
    the name - hey, I'm a Slacker, it's only appropriate, right? ;-P

    I've been 100% satisfied ever since.

    Cheers!
    Rich
     
  18. Schueler

    Schueler Guest

    Dear all,

    thanks so far for contributing to this fruitful discussion. I finally
    bought the MDE 8051 Trainer Board, populated with the Maxim DS89C4xx
    8051 derivative. It can be programmed via HyperTerminal by means of a
    hex file. Since this is my first encounter with microcontrollers, let
    me ask for advice on how to generate a hex file for this device. Is
    there any freeware or open source tool for this purpose, or how should
    I proceed?
    Thank you.
     
  19. Rich Webb

    Rich Webb Guest

    [Please don't top-post.]

    The Small Device C Compiler (aka sdcc) is free open-source software and
    is reasonable to start out with. You're probably best to begin poking
    around with getting your first "Hello, world!" program (i.e., get a
    selected pin to toggle at a selected rate) in assembler so that you get
    accustomed to the 8051 architecture before moving on to C programming.

    http://sdcc.sourceforge.net/
     
  20. Chris H

    Chris H Guest

    I have used this one, and it supports only the mainstream 8051s. The
    Dallas parts are not mainstream and use a non-standard core with
    additional non-standard SFRs.

    SDCC does not generate good or sensible code. Last time we had this
    discussion and it got heated, someone (no names, to protect the guilty)
    who was arguing very strongly for SDCC was going to produce a
    comparison between the two. Some two years later and we are still
    waiting....

    Keil used to recommend the Dunfield 8051 compiler for hobby users with
    no money.

    There is also the option of the Keil evaluation compiler. The biggest
    problem with the 8051 is the very small amount of RAM; you usually run
    out of this first rather than program memory. The Keil compiler uses a
    rigorous call-tree analysis to do data overlaying, something you should
    not do manually! I have seen 120 bytes of data shrink to 19 bytes with
    the Keil compiler.

    It also comes with a large number of example programs.

    The other strong point of the Keil system is the extremely good
    simulator/debugger. It will also mate with the www.labcenter.com PCB
    design software, which also has a SPICE simulator, so you can actually
    simulate the hardware and software together. The Labcenter SW is not
    expensive.


    NOTE: it is worth looking around at some of the 8051 dev kits, as the
    limit on their eval Keil compilers is twice the normal one.
    Ouch! Be fair, the 8051 is very cost effective for a lot of things. Not
    everything warrants a PPC :)
     