I’m making a piece of hardware that processes the voltage from a pot. An STM32F013K6 reads the pot, I run the value through a lookup table, and the result goes to a 12-bit DAC, whose output is then used as a VCA CV.
The ADC reading is 12 bits and the LUT is a 256-entry (8-bit-indexed) array of uint16_t. I create an 8-bit index from the original 12 bits: I right-shift by 2 to drop the two noisiest low bits, then mask off the two highest of the remaining 10 bits, leaving an 8-bit number I use as the LUT index, so the table cycles around four times over the full ADC range.
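For clarity, here is a minimal sketch of that index derivation as I understand it (the function name `lut_index` is just mine for illustration):

```c
#include <stdint.h>

/* Derive an 8-bit LUT index from a 12-bit ADC reading:
 * drop the two noisy low bits, then keep only the low 8 of the
 * remaining 10 bits, so the 256-entry table repeats four times
 * across the full 0..4095 ADC range. */
static inline uint8_t lut_index(uint16_t adc12)
{
    return (uint8_t)((adc12 >> 2) & 0xFF);
}
```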
Unfortunately this has proved very noisy. Two adjacent entries in the lookup table might read, for example, 416 and 448; with a 3.3 V reference on the DAC, that step is 3.3 × 32 / 2^12 ≈ 26 mV. On top of that, the bottom two or three bits of the index are still noisy (i.e. the bottom four or five bits of the original ADC reading), so I sometimes end up with around 200 mV of noise on the DAC output.
Does anyone have any idea how I can implement this better? I thought of averaging the ADC readings to reduce the noise, and maybe using linear interpolation between LUT entries to smooth the output steps, but I don’t know if I’m missing something simpler, as this is my first real project using STM32.
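In case it helps frame the question, here is a rough sketch of the two ideas I had in mind. Both function names and the filter constant are my own placeholders, not working code from the project; the interpolation reuses the two low ADC bits I currently throw away as a fractional position between adjacent table entries:

```c
#include <stdint.h>

/* Cheap exponential (IIR) smoothing of raw 12-bit ADC readings.
 * FILTER_SHIFT = 3 averages over roughly the last 8 samples and
 * needs no sample buffer. */
#define FILTER_SHIFT 3

static uint16_t adc_filter(uint16_t raw)
{
    static uint32_t acc;                  /* filtered value << FILTER_SHIFT */
    acc += raw - (acc >> FILTER_SHIFT);
    return (uint16_t)(acc >> FILTER_SHIFT);
}

/* Linear interpolation between adjacent entries of a 256-entry LUT,
 * using the two low ADC bits (otherwise discarded) as a 0..3
 * fractional position between lut[i] and lut[i+1]. The index wraps
 * at the end of the table, which lines up with the table repeating
 * four times over the ADC range. */
static uint16_t lut_interp(const uint16_t *lut, uint16_t adc12)
{
    uint16_t idx  = (adc12 >> 2) & 0xFF;   /* 8-bit table index      */
    uint16_t frac = adc12 & 0x3;           /* 2-bit fractional part  */
    int32_t  a = lut[idx];
    int32_t  b = lut[(idx + 1) & 0xFF];    /* wrap into next cycle   */
    return (uint16_t)(a + ((b - a) * frac) / 4);
}
```

The idea would be to feed each raw ADC sample through `adc_filter` first and then pass the smoothed value to `lut_interp`, so the 26 mV table steps get subdivided into ~6.5 mV sub-steps and the remaining bit noise is averaged down.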
Any help much appreciated!