FTDI Community


Messages - Randy

Discussion - Software / Re: Interfacing to FT232H
« on: February 19, 2018, 02:50:19 PM »

Using the FlushBuffer() method doesn't seem to make a difference.  I had been putting it after setting the clock division register in the sequence of events.  I added it higher up in the sequence, after SetBitMode() as you suggested.  That didn't make a difference either.  Maybe I'm mixing up a variable somewhere.

Sometimes the second try of sending a bad command also gets more (or fewer) than 2 bytes.  I will just do the send-bad-command/read-result sequence four times instead of two.  That seems to always work, and it takes virtually no extra time.
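In case it helps anyone else, a minimal sketch of that approach using the FTD2XX_NET wrapper (the loop count of four is just what worked for me, and the purge-first step is my own assumption about clearing stale data):

```csharp
// Purge any stale data first, then run the bad-command sync a few times.
ftdi.Purge(FTDI.FT_PURGE.FT_PURGE_RX | FTDI.FT_PURGE.FT_PURGE_TX);

bool synced = false;
for (int attempt = 0; attempt < 4 && !synced; attempt++)
{
    uint written = 0, available = 0, read = 0;
    byte[] tx = { 0xAA };                       // deliberately invalid MPSSE opcode
    ftdi.Write(tx, tx.Length, ref written);

    // Wait for the response to show up in the receive queue.
    do { ftdi.GetRxBytesAvailable(ref available); } while (available == 0);

    byte[] rx = new byte[available];
    ftdi.Read(rx, available, ref read);

    // Only a clean two-byte 0xFA/0xAA reply counts as in sync.
    synced = (read == 2 && rx[0] == 0xFA && rx[1] == 0xAA);
}
```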


Discussion - Software / Interfacing to FT232H
« on: January 28, 2018, 12:30:38 PM »

I am writing code (in C#) to interface to an FT232H (actually a C232HM-DDHSL-0 cable).  I am using the FTDI FT2XX DLL, its C# wrapper, and MPSSE.

In looking through all the different code examples on the FTDI site, there seems to be a general sequence of commands one sends in order to get the chip set up properly.  And in all of the different versions (C#, C++, VB, Delphi), there is a place where two bad commands are sent in a row, one with 0xAA and one with 0xAB.  What is the purpose of doing that?

The code in the examples is all written so that the bad command is sent first, then a routine is called to read the data back.  That data should be two bytes: first 0xFA, then the bogus command byte.  The read routines are generally written so that the queue is interrogated to see how many bytes are available, that many bytes are read, and finally the first two bytes are checked to make sure that byte[0] = 0xFA and byte[1] = <the bad command that was sent>.
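For reference, the send-and-verify routine described above can be sketched with the FTD2XX_NET wrapper roughly like this (the helper name SyncMPSSE is my own, not from the FTDI examples):

```csharp
// Sketch only - assumes an open FTDI device already switched into MPSSE mode.
// 0xAA and 0xAB are not valid MPSSE opcodes, so the engine should answer
// with 0xFA followed by an echo of the offending byte.
static bool SyncMPSSE(FTDI ftdi, byte badCommand)
{
    uint written = 0, available = 0, read = 0;
    byte[] tx = { badCommand };
    ftdi.Write(tx, tx.Length, ref written);

    // Interrogate the queue until the response is available.
    do { ftdi.GetRxBytesAvailable(ref available); } while (available == 0);

    byte[] rx = new byte[available];
    ftdi.Read(rx, available, ref read);

    // Check for 0xFA followed by the echoed bad command.
    return read >= 2 && rx[0] == 0xFA && rx[1] == badCommand;
}
```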

After start-up in my code, the very first time this bad-command sequence is run it always reads some random number of bytes with random values (though many times it seems to be byte[0] = 0x40 and byte[1] = 0x40).  The second time, and all subsequent times, that a bad command is sent, it always returns 2 bytes with byte[0] = 0xFA and byte[1] = <bad command>.

Is this the way it is supposed to be working?

What is the purpose of this?

I am writing my code for SPI, and while there is a set of functions for MPSSE and SPI, I have not yet had a chance to make the C# wrapper for it.  So I am setting up SPI in my own code.  So far so good.
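For anyone following along, the SPI setup I'm doing boils down to a handful of MPSSE opcodes from FTDI's AN_108 app note.  A sketch of what I send after the bad-command sync passes (the 1 MHz clock and the pin directions are my choices for the C232HM cable's SPI wiring, not something from the examples):

```csharp
// Configure MPSSE for SPI mode 0.
uint written = 0;
byte[] setup =
{
    0x8A,             // disable divide-by-5 (use the 60 MHz master clock)
    0x97,             // turn off adaptive clocking
    0x8D,             // disable three-phase clocking
    0x86, 0x1D, 0x00, // TCK divisor 0x001D -> 60 MHz / ((29 + 1) * 2) = 1 MHz
    0x80, 0x08, 0x0B  // low byte: CS high; SCK, MOSI, CS outputs; MISO input
};
ftdi.Write(setup, setup.Length, ref written);

// A single SPI byte write then uses opcode 0x11:
// clock data out MSB-first on the falling edge.
byte[] xfer = { 0x11, 0x00, 0x00, 0xDE };  // length 0x0000 -> 1 byte; data 0xDE
ftdi.Write(xfer, xfer.Length, ref written);
```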

