Kristian

Reputation: 133

I2C bizarre delay issue when reading

I've been trying to get my ATTINY85 to bit-bang I2C (read/write). I have the following configuration:

PB0 = SDA
PB1 = LED
PB2 = SCL

I'm able to write without any problems, but reading only works if I have my 'delay()' function inside the read loop, so far so good:

char i2c_read(void)
{
    uint8_t B = 0;
    DDRB &= 0b11111110; // switch PB0 to input

    for ( int bit = 0; bit < 0x08; bit++ )
    {        
        delay(); // <--!!!!!!!!! the root of all evil

        SIGNAL_HIGH( PORT_SCL );

        B <<= 1;

        if( PINB & (1 << PB0 ) )
        {
            B |= 1;              
        } 
        else
        {
            B |= 0;              
        }

        SIGNAL_LOW( PORT_SCL );
    }

    DDRB |= 0b00000001; // switch PB0 as output

    i2c_nack();

    return B;
}

If I remove the delay(), I2C no longer works and I cannot read from the device (the device doesn't respond). That seems logical, but the reason I want to remove delay() is that it's not actually a 'true' delay: it simply turns an LED on and off on a different pin (PB1), while the I2C lines are on PB0 and PB2.

_delay_ms was too slow, so I just turned the PB1 pin on and off to create a tiny delay, and that's the only way it works. Here are the contents of my delay function; everything works great if I leave it like this:

void delay()
{
    LED_ON();
    LED_OFF();
}

void LED_ON( void )
{
    PORTB |= 0b00000010; // PB1
}

void LED_OFF( void )
{
    PORTB &= 0b11111101; // PB1
}

I suspected that I had probably 'nailed' a perfect delay, one that creates the signal length expected by the other device, so I tried to reproduce the same delay with a for loop, checking it with an oscilloscope:

void delay()
{
   for( int i=0; i<20; i++){ }
}

No luck; I2C reading stops working.

Then I decided to switch the LED to another pin and leave PB1 completely alone, to see whether the issue is delay-related or pin/circuit-related:

void delay()
{
    LED_ON();
    LED_OFF();
}

void LED_ON( void )
{
    PORTB |= 0b00001000; // PB3
}


void LED_OFF( void )
{
    PORTB &= 0b11110111; // PB3
}

And strangely, I2C stopped working again! It only works if I toggle PB1 high/low. I still can't tell whether I just happened to nail the perfect delay (and toggling PB1 happens to take less time than toggling PB3), or whether it has something to do with the circuit itself, with the LED acting as some kind of pull-up/pull-down on the I2C lines (forgive my ignorance, I'm a beginner); but then again, PB1 is not connected to the I2C lines at all.

Can anyone please shed some light on why it only works when I toggle PB1 on/off instead of doing a real delay? Thanks!

The full source:

#define PORT_SDA PB0
#define PORT_SCL PB2

#define SIGNAL_HIGH(PORT) PORTB |=  ( 1 << PORT )
#define SIGNAL_LOW(PORT)  PORTB &= ~( 1 << PORT )

void delay();
void LED_ON(void);
void LED_OFF(void);

void i2c_init(void);
void i2c_start(void);
char i2c_read(void);
void i2c_stop(void);
void i2c_nack(void);
void i2c_ack(void);
void i2c_ack_slave(void);
void i2c_write(uint8_t byte);

void i2c_init()
{
    DDRB = 0b00000010; // TODO: should be removed once the weird delay issue is solved
    DDRB |= ( 1 << PORT_SDA );
    DDRB |= ( 1 << PORT_SCL );
}

void i2c_start( void )
{
    SIGNAL_LOW(  PORT_SCL );
    SIGNAL_HIGH( PORT_SDA );
    SIGNAL_HIGH( PORT_SCL );
    SIGNAL_LOW(  PORT_SDA );
    SIGNAL_LOW(  PORT_SCL );
}

void i2c_stop( void )
{
    SIGNAL_LOW(  PORT_SCL );
    SIGNAL_LOW(  PORT_SDA );
    SIGNAL_HIGH( PORT_SCL );
    SIGNAL_HIGH( PORT_SDA );
}

void i2c_ack(void)
{
   SIGNAL_LOW(  PORT_SDA );
   SIGNAL_HIGH( PORT_SCL );
   SIGNAL_LOW(  PORT_SCL );
   SIGNAL_HIGH( PORT_SDA );
}

void i2c_nack(void)
{
   SIGNAL_HIGH( PORT_SDA );
   SIGNAL_HIGH( PORT_SCL );
   SIGNAL_LOW(  PORT_SCL );
}

void i2c_ack_slave(void)
{
    SIGNAL_HIGH( PORT_SCL );
    SIGNAL_LOW( PORT_SCL );
}

void i2c_write(uint8_t byte)
{
    uint8_t bit;

    for ( bit = 0; bit < 0x08; bit++ )
    {
        if( ( byte << bit ) & 0x80 )
            SIGNAL_HIGH( PORT_SDA );
        else
            SIGNAL_LOW( PORT_SDA );

        SIGNAL_HIGH( PORT_SCL );
        SIGNAL_LOW( PORT_SCL );
    }

    // Clear both lines (needed?)
    SIGNAL_LOW( PORT_SCL );
    SIGNAL_LOW( PORT_SDA );

    i2c_ack();
}

char i2c_read(void)
{
    uint8_t B = 0;
    DDRB &= 0b11111110; // switch PB0 to input

    for ( int bit = 0; bit < 0x08; bit++ )
    {        
        delay(); // <-- the root of all evil

        SIGNAL_HIGH( PORT_SCL );

        B <<= 1;

        if( PINB & (1 << PB0 ) )
        {
            B |= 1;              
        } 
        else
        {
            B |= 0;              
        }

        SIGNAL_LOW( PORT_SCL );
    }

    DDRB |= 0b00000001; // switch PB0 as output

    i2c_nack();

    return B;
}


void delay()
{
    LED_ON();
    LED_OFF();
}


void LED_ON( void )
{
    PORTB |= 0b00000010;
}


void LED_OFF( void )
{
    PORTB &= 0b11111101;
}

Upvotes: 0

Views: 654

Answers (1)

tofro

Reputation: 6063

I2C defines a number of minimum timings for its signals. Important here are the HIGH and LOW times of SCL: the amount of time SCL must be stable before the next transition to the opposite state is allowed. These timings are typically ~5 µs; exact figures should be taken from the datasheet.

The loop-around at the end of your read loop takes somewhere between 2 and 3 instructions, depending on what the compiler does. An AVR instruction, depending on your clock rate, takes roughly ~200 ns, so without a delay SCL is low for only about 600 ns, give or take, which is far too short, at least for your specific "other-end" device.

When you inserted a function call and a port access in the called function, you added enough instructions to keep SCL LOW long enough for the transfer to work properly.

In your code, the HIGH time is not so much of a problem, because the AVR executes more instructions (shifting and testing the bit) while SCL is high; apparently enough time goes by to keep SCL HIGH for long enough.

The fact that you are toggling a port pin in your delay function is not relevant here; all that matters is that you spend some time while SCL is low. Obviously, what you currently do wastes a port pin just to burn time. Use _delay_us instead, experimenting with the delay value, but check "the other end's" datasheet for the exact timing needed; 4-5 µs should be fine.

Why didn't your delay loop work? It was most probably optimized away by the compiler, which recognized that nothing relevant happens in that empty loop.

Ideally, you should try to read SDA around the middle of SCL's HIGH phase; an unrolled loop for the 8 bits with some _delay_us calls spread in should work perfectly.

Upvotes: 1
