CV Protection

Hello,

I am wondering how to best protect the CV input of a 5V circuit from a 12V input signal. Do I just use an op-amp that is 12V tolerant? If so, how do I best adjust its output range?

Best,
Karg

What do you mean by a “5V circuit”?

You mean the ADC input pin of a microcontroller powered by +5V?

You can have a look at the schematics of Grids or Branches

@pichenetes: yes, exactly :slight_smile:

@TheSlowGrowth: thanks, I’ll have a look at that! :slight_smile:

I always do it with a rail to rail op-amp, in inverting configuration. In this configuration, the op-amp input is never directly exposed to the external voltage. MCP600x is a good candidate for this task: it’s super cheap and designed for low voltages, like MCUs. No need to use anything fancier - the MCU’s ADC will be the weakest link in terms of noise and bandwidth anyway.
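For reference, the transfer function of such an inverting stage (assuming the non-inverting input is held at a fixed reference V_ref derived from the +5V rail; the resistor names are generic placeholders, not taken from a specific schematic) looks like this:

$$ V_{adc} = V_{ref} - \frac{R_f}{R_{in}} \left( V_{cv} - V_{ref} \right) $$

Because the op-amp runs from the same 0V/+5V (or 0V/+3.3V) supply as the MCU and is rail-to-rail, V_adc can never leave that range no matter how far V_cv swings, which is what protects the ADC pin.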

See Branches and Grids’ schematics for the +5V version, and all the other recent modules for the +3.3V version.

More explanations as to why it’s a good design here

It has a tiny drawback: your ADC readout is maximal when the CV is at its minimum value and vice versa. But it’s not a big deal to do the subtraction (and, if necessary, add a constant to null out the offset) in software.
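For example, something like this on the software side (a sketch only, with made-up constants for a 12-bit ADC and a ±5V CV range, not taken from any actual firmware):

```cpp
#include <cstdint>

// Hypothetical constants: a 12-bit ADC and a CV range of -5V .. +5V
// mapped (inverted) onto the full ADC span by the analog front-end.
constexpr int32_t kAdcMax = 4095;   // 12-bit full scale
constexpr float kCvMinVolts = -5.0f;
constexpr float kCvMaxVolts = +5.0f;

// The inverting op-amp stage means raw = kAdcMax corresponds to kCvMinVolts
// and raw = 0 corresponds to kCvMaxVolts. Undo that in software.
float AdcToCv(int32_t raw) {
  float normalized = static_cast<float>(kAdcMax - raw) / kAdcMax;  // 0..1, un-inverted
  return kCvMinVolts + normalized * (kCvMaxVolts - kCvMinVolts);
}
```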

Ah, great info. Thanks for the link and explanation.

Yes, great!
This board is always a source of such great information!

I had a question on MW that seemed to have been missed; I was asking Olivier why he changed the processing of CVs (e.g. scaling, offsetting) from the analogue domain in earlier modules such as Braids/Grids to multiplexing the CVs and doing it all (apparently) in software in later work, e.g. Elements/Rings. Any insight?

Makes calibration possible entirely in software?

a|x

That was my guess, or perhaps it makes more elaborate Easter egg cyphers possible :). It could also be hardware-oriented – fewer components (and failure modes) and easier track routing for serial data; more analogue inputs doable with fewer I/O pins required.

Scaling and offsetting is still done in the analog domain - the minimum acceptable CV is mapped to an ADC input voltage of 3.3V, the maximum acceptable CV is mapped to an ADC input voltage of 0V.

The extra CV filtering found in Rings/Elements is simply due to the fact that the STM32F4’s built-in ADCs are crap.

The only thing that has truly changed is that the pot, attenuverter and CV are now acquired through independent ADC channels and are combined in software. The reason for doing so is that it gives me freedom to tweak the law of pots, and to add “virtual notches” to them - for example the attenuverters have a small plateau near 12 o’clock, and a quadratic or quartic law depending on the parameters. I have started doing this with Tides.
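A rough sketch of what such a software law could look like (the dead-zone width and the exact curve below are arbitrary illustrative choices, not the actual values used in Tides):

```cpp
#include <cmath>

// Map a raw attenuverter reading in 0..1 to a bipolar amount in -1..+1,
// with a small flat "virtual notch" around the 12 o'clock position and a
// gentle non-linear law so small modulation amounts are easy to dial in.
float AttenuverterLaw(float raw, float dead_zone = 0.02f) {
  float bipolar = 2.0f * raw - 1.0f;            // -1 .. +1
  if (std::fabs(bipolar) < dead_zone) {
    return 0.0f;                                 // plateau near the center
  }
  // Remove the dead zone, then re-normalize to -1..+1.
  float sign = bipolar > 0.0f ? 1.0f : -1.0f;
  float x = (std::fabs(bipolar) - dead_zone) / (1.0f - dead_zone);
  return sign * x * x;                           // quadratic response
}
```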

It also makes factory testing easier, since I can test the pot, attenuverter, and CV input separately (otherwise: you notice a strange offset on a parameter - how do you know that it comes from the main pot, from the CV input, or the attenuverter circuit?).

From the perspective of the user, hardware attenuverters like those used in Braids make it more difficult to adjust subtle modulation amounts, and since they work by crossfading the original signal and the signal inverted through an op-amp, the circuit cannot entirely cancel very fast edges. There’s also the issue of ADC resolution… If your ADC has a resolution of 10 bits and your hardware attenuverter attenuates the signal by a factor of 16, you’ll capture the signal with only 6 bits of resolution, whereas doing the attenuation in software preserves resolution, especially in the recent modules, which all use 32-bit floating point numbers internally.

Note that multiplexing is kind of a detail here: there’s multiplexing only because there are more pots / attenuverters than ADC inputs, and throwing in a 4051 is cheaper than using an MCU with more pins.
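In pseudo-code, scanning a 4051-style mux into a single ADC pin is just this (the HAL function names are placeholders, not a real vendor API):

```cpp
#include <cstdint>

// Placeholder HAL calls - names and behaviour are hypothetical.
static void SetMuxAddress(uint8_t channel) { /* drive the 4051's A/B/C select pins */ (void)channel; }
static uint16_t ReadAdcBlocking() { /* one conversion on the pin fed by the 4051 */ return 0; }

// Scan all 8 multiplexed pots/attenuverters into a buffer. A short settling
// delay is needed after switching channels before the ADC sample is taken.
void ScanMux(uint16_t values[8]) {
  for (uint8_t channel = 0; channel < 8; ++channel) {
    SetMuxAddress(channel);
    // (settling delay would go here)
    values[channel] = ReadAdcBlocking();
  }
}
```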


That plus the software normalization trick in Warps & co, right?

Yes, it makes software normalization possible too :slight_smile:
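Roughly, the idea is that the input’s normalization contact is driven with a pseudo-random bit sequence, and the firmware checks whether the ADC readings follow it; if they do, nothing is patched into the jack. A sketch of that (the scoring scheme and thresholds here are illustrative, not from actual firmware):

```cpp
#include <cstdint>

// Tracks whether an input jack is unpatched: when nothing is plugged in,
// the ADC readings simply echo the pseudo-random sequence sent out on the
// normalization contact; a patched cable breaks that correlation.
class NormalizationDetector {
 public:
  // Call once per sample with the bit that was sent and a thresholded
  // version of the ADC reading.
  void Update(bool sent_bit, bool read_bit) {
    score_ += (sent_bit == read_bit) ? 1 : -1;
    if (score_ > 32) score_ = 32;
    if (score_ < -32) score_ = -32;
  }
  // High positive score: the input is unpatched (it just echoes our sequence).
  bool is_unpatched() const { return score_ > 16; }

 private:
  int32_t score_ = 0;
};
```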

Thanks a lot for the answer, it makes good sense. I’m actually working on a CV input scaler at the moment, and because I’m a programming pleb I’m going for the analogue solution. Using two inverting amps in series I can switch in resistors and offsets to optimise for +5V, +10V and bipolar ±5V levels, all protected to 3.3V using the rail-to-rail idea (thanks for the inspiration).

On the DAC side the opposite occurs, and I’ve implemented a “fine tune” bipolar offset pot using diodes to create a centre deadband. Just have to decide on “innie” vs “outie” output protection resistors; at the moment I’m leaning towards a low value “outie” to decrease cable capacitance effects because I don’t think the TL07X really cares about short circuits…

> all protected to 3.3V using the rail-to-rail idea (thanks for the inspiration).

I’m aware of another product from another brand (not released yet) which uses something like that on its inputs / outputs, to guarantee the best use of the AD / DA resolution.

But the two op-amps are a bit of a luxury here.

> I don’t think the TL07X really cares about short circuits…

I’ve never fried one.

The only argument for using 1k instead of something smaller would be to allow mixing through passive mults or stackables…