How to Interface a MAX1204 8-channel 10-bit ADC


Post by Neo » Sun Nov 29, 2009 4:37 pm

MAXIM makes a variety of analogue-to-digital conversion ICs, and they have a very liberal and easy sample policy. The microprocessor I used on Dilbert, the ATMEL 8515 AVR, does not have built-in analogue capabilities, so I chose the MAX1204, an 8-channel, 10-bit Analogue to Digital Converter (ADC). The MAX1202 is an identical product with 12 bits of resolution. In fact, the software needed is identical for both; with the 1204 one simply discards the lowest two bits of data.

This paper will make a lot more sense if you have the MAX1204 data sheet in front of you.

The 1204 uses a four-wire SPI interface. The 8515 processor has SPI hardware support built in, and using it would have been fast with minimal software overhead. However, I decided on a software bit-twiddling approach, since it had the overwhelming advantage of not requiring me to rewire my board or rework existing software that already used the SPI pins. I simply found three unused I/O pins and used them.

I ignored the Chip Select because there was only one chip on my board. One could ignore the chip select with multiple chips, but you would need separate clock or MOSI lines for each chip, since the rising edge of the clock with MOSI high and CS low is what triggers the chip. Of course, one could have the three SPI lines go to every chip and use the CS line to control which chip is being addressed. The latter would be required if one had a mix of SPI peripherals on board.
Hardware interface considerations

Figure 1, below, is from the MAXIM web site and shows the typical connection between a 3V processor and the MAX1204 chip. Since my robot has +5V logic (74HCXXX), I simply tied the logic-level sense to the +5V supply.

10 bits is pretty high resolution. To avoid digital noise on the analogue signals, I added a separate +5V supply (78L05) devoted just to the ADC and the photodiodes used as inputs. The ground for all of the above was tied into one point where the power came into the regulator. With minimal bypass capacitors on the ADC inputs I easily get stable readings. The other five inputs, connected to Dilbert's floor sensors, are not bypassed and are driven from the +5V logic supply, and there is one bit of dither when reading them. Since I lop the floor readings off to 8 bits, it turns out not to be important. Still, only one bit of dither from the logic supply indicates to me that the separate analogue supply wasn't really needed in my design.
[Figure 1: maxim_1.gif]
Software considerations

A conversion is initiated when a control byte is shifted into the chip. On a more careful reading of the specification, it turns out that conversion is actually initiated when the first 1 bit is clocked into the chip while the chip is selected. Since in my design I tied CS low and the chip is always selected, a conversion is initiated when SCK has a positive transition with MOSI = 1. The next 7 clocks shift in the rest of the control word, which consists of three channel-select bits and four other control bits. In the code example below, I initialize a byte with $8F (1000,1111), replace the "000" with my channel-select bits (0-7) and then shift it left out the MOSI I/O bit on the micro-controller. It is important to note that, in order not to cancel the current conversion and start a new one, MOSI must be left at the 0 state for all other clock cycles. This is noted in the source code.
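The control-byte shift-out can be sketched in C as follows. This is not the driver itself: the real pin writes (set MOSI, raise SCLK, drop MOSI, drop SCLK) are replaced by a log array, `mosi_log`, which I invented here so the bit order can be checked on a desktop.

```c
#include <stdint.h>

/* Records the MOSI level presented on each of the 8 rising clock edges. */
static uint8_t mosi_log[8];

/* Shift the control byte out MSB-first. On hardware, each loop pass would
   set MOSI from the byte's MSB, raise SCLK, then drop MOSI back to 0 so a
   stray 1 on a later clock cannot start a fresh conversion. */
static void shift_out_control(uint8_t control)
{
    for (int i = 0; i < 8; i++) {
        mosi_log[i] = (control >> 7) & 1;  /* MOSI level at the rising edge */
        control <<= 1;                     /* advance to the next bit */
        /* ...SCLK would fall here with MOSI already cleared to 0 */
    }
}
```

For the base value $8F the logged MOSI sequence is 1,0,0,0,1,1,1,1: the leading 1 is the start bit, the three 0s are the (unmodified) channel-select field, and the trailing four 1s are the remaining control bits.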

The next clock cycle, after the last bit of the control byte has been shifted in, will shift out the Most Significant Bit (MSB) of the results on the MISO line. Eleven more cycles shift out the rest of the conversion results. Since only 10 bits are valid, I stop shifting out data after 10 bits. This leaves the chip waiting to unload the "guard" bits, but that is OK, since the start of the next cycle causes the previous conversion results to be discarded. I implemented this by filling my results word with $FFC0 (1111,1111,1100,0000), which is ten "1"s followed by six "0"s. Then I simply left-shift in the data until there is no carry bit (i.e. the first 0 bit is shifted out).
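The $FFC0 sentinel trick translates naturally to C, where the "carry" has to be read off the top bit explicitly before each shift. A minimal sketch, with the incoming MISO bits supplied as an array (my own test scaffolding, not part of the driver):

```c
#include <stdint.h>

/* Demo bit stream standing in for MISO, MSB first (needs >= 11 entries,
   since the loop below runs until the first sentinel 0 falls out). */
static const uint8_t demo_bits[11] = {1,0,1,0,1,0,1,0,1,0,1};

/* Pre-load the result with ten 1s and six 0s ($FFC0), then shift data in
   from the right. Each pass notes the bit about to fall off the top (the
   "carry"); when the first original 0 emerges, all data bits are in place. */
static uint16_t shift_in_result(const uint8_t *bits)
{
    uint16_t r = 0xFFC0;
    int carry, i = 0;
    do {
        carry = (r & 0x8000) != 0;            /* bit shifted out this pass */
        r = (uint16_t)(r << 1) | (bits[i++] & 1);
    } while (carry);                          /* stop once a 0 falls out */
    return r;
}
```

Feeding in the alternating pattern above yields $555, i.e. the eleven bits 10101010101 right-justified, with every sentinel bit gone.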

It should be noted that the discarded bits seem to be valid, but are not guaranteed. If your application needs high-resolution results but does not care about accuracy, the 1204 could be used as a 12-bit ADC.

Selecting a channel
The MAX1204 can be operated in several different modes. The mode I chose was single-ended, which means each line simply measures from ground to +Vref and returns a result from $000 to $3FF. The other modes are differential and bipolar measurements. I think the channel select was implemented to make differential measurements easy; there are only four channels in that case. However, it makes selecting a channel awkward for single-ended use, since the bit pattern is not binary. That is easy enough to deal with: simply re-name the channels on the chip. However, that isn't what I did. I ended up re-arranging the bits of the channel select so that binary 0-7 selects inputs 0-7 (pins 1-8) on the chip. Thus one can simply use the chip without having to translate which input is selected for each channel.
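The rearrangement mirrors the sbrc/sbr sequence in the driver below: channel bit 0 lands in control-byte bit 6, bit 1 in bit 4, and bit 2 in bit 5, on top of the $8F base. A C sketch of the same mapping (function name mine, for illustration):

```c
#include <stdint.h>

/* Build the MAX1204 control byte for logical channel 0-7, folding the
   channel bits into positions 6, 4 and 5 of the $8F base value exactly
   as the assembly driver does, so binary n selects input n. */
static uint8_t control_for_channel(uint8_t ch)
{
    uint8_t c = 0x8F;            /* start, unipolar, single-ended, no power-down */
    if (ch & 0x01) c |= 1 << 6;  /* channel bit 0 -> control bit 6 */
    if (ch & 0x02) c |= 1 << 4;  /* channel bit 1 -> control bit 4 */
    if (ch & 0x04) c |= 1 << 5;  /* channel bit 2 -> control bit 5 */
    return c;
}
```

For example, channel 0 gives $8F unchanged, channel 1 gives $CF, and channel 7 sets all three select bits, giving $FF.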
Other details

The following code is written for the IAR AVR assembler. The instructions "BeginCritical" and "EndCriticalReturn" are simply aliases (macros) for the CLI and RETI instructions.


;+
;----------------------------------------------
; ADCReadChannel
;
; MAX1204 Driver
;
; Passed:   Wl (R24)= Channel
; Returned: Yh:Yl (R29:R28)  = 10 bit results
; Uses:     Flags
; Stack:    none
; Notes:    - Disables interrupts for ~163 cycles (20us @ 8MHz)
;           - Assumes CS is always zero: just uses SCLK, Vin and Vout
;           - Doesn't clock in Guard bits, these get clobbered when
;           the next conversion is started
;
; Channel Assignment:
;       0       = CH0 (Pin 1)
;       1       = CH
;
; Note: Port data direction must be set elsewhere (user init code)
; The following macros are used to manipulate the control lines.
;
#define ADC_SCLK        PORTA, PA0    // Output
#define ADC_DIN         PORTA, PA1    // Output
#define ADC_DOUT        PINA, PA2     // Input
;-

ADCReadChannel:
      ldi     Yl, 0x8F        ; Unipolar, single ended, no-power down
      sbrc    Wl, 0           ; Rearrange bits so binary # selects
      sbr     Yl, 1<<6        ; correct channel
      sbrc    Wl, 1
      sbr     Yl, 1<<4
      sbrc    Wl, 2
      sbr     Yl, 1<<5
      ldi     Yh, 8           ; Loop count to shift out control byte

      BeginCritical           ; CLI

NextControlBit:
      sbrc    Yl, 7           ; 8 * 9 = 72 cycles
      sbi     ADC_DIN

      sbi     ADC_SCLK
      lsl     Yl
      cbi     ADC_DIN         ; leave DIN cleared since conversion is triggered by a 1 on this line.

      cbi     ADC_SCLK
      dec     Yh
      brne    NextControlBit

      ldi     Yh, 0xFF        ; Yh:Yl = 0xFFC0 (10 bits to shift in)
      ldi     Yl, 0xC0        ; Loop, below will clock in the 10+1 bits

NextDataBit:
      sbi     ADC_SCLK        ; 11 * 8 = 88 cycles
      lsl     Yl              ; Shift Y left, Y(0) = 0
      rol     Yh
      cbi     ADC_SCLK        ; drop clock
      sbic    ADC_DOUT        ; inc Y(0) if DOUT = 1
      inc     Yl              ; (inc doesn't affect Carry)
      brcs    NextDataBit     ; Loop until 0 shifted out

       EndCriticalReturn       ; = 73+88+2=163 cycles till SEI
Courtesy: Larry Barello