After figuring out how to drive the AD7195 analog-to-digital converter and running tons of tests with the DIY Wheatstone bridge simulator that I described in my last post, I'm finally at the stage of connecting a "real" pressure transducer to the ADC. It is this Wika TTF-1 600-bar sensor, "kindly donated" by a busted ServiceJunior digital pressure gauge:
But first, I need to say a few words about the small "hurdle" I ran into during my tests. It's actually pretty funny and dumb at the same time.
I wrote a small piece of code to convert ADC readings into bar and display them on the graphic LCD of the development board I used (a Silicon Labs Wireless Starter Kit BRD4001A with a BGM13P22 Bluetooth board), and, naturally, I included the detection and display of min and max pressure spikes. The 1.28-inch LCD is tiny, but it has a resolution of 128x128 pixels, which is more than enough to display multiple numeric values at the same time (something the ServiceJunior gauge can't actually do - it displays either max or min, but never both).

As I was testing the program with the bridge simulator, I saw that it would occasionally register a spike with an impossibly high value - all 15 remaining bits of the 24-bit conversion result (after the 9-bit shift) were recorded as 1s. Naturally, I thought there was a bug in my code, like a bad pointer or something similar, and I spent quite a lot of time searching for it without any success - until I accidentally discovered that the spike was being registered every time I turned my desk lamp on or off! There were no bugs in the code - the unshielded bridge simulator was simply picking up the EMI noise from the arcing mechanical switch! Just a heads-up if you ever decide to work with sensitive ADC circuits: shielding is, apparently, super important!
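For the curious, the min/max tracking itself is trivial - here is a minimal sketch of the kind of logic I'm describing. This is not my actual firmware; the type and function names are made up for illustration, and pressure is kept in integer millibar to avoid floating point on the MCU:

```c
#include <stdint.h>

/* Hypothetical min/max spike tracker - a sketch of the logic
 * described above, not the actual firmware. Values are in
 * millibar so the MCU can stay in integer arithmetic. */
typedef struct {
    int32_t min_mbar;
    int32_t max_mbar;
} spike_tracker_t;

/* Start a new tracking session from the current reading. */
static void tracker_reset(spike_tracker_t *t, int32_t current_mbar)
{
    t->min_mbar = current_mbar;
    t->max_mbar = current_mbar;
}

/* Feed every converted sample through this; extremes stick. */
static void tracker_update(spike_tracker_t *t, int32_t sample_mbar)
{
    if (sample_mbar < t->min_mbar) t->min_mbar = sample_mbar;
    if (sample_mbar > t->max_mbar) t->max_mbar = sample_mbar;
}
```

Of course, with logic this simple, a single EMI-induced garbage sample gets latched as the "max" forever - which is exactly why the spike kept appearing on the display every time the lamp switch arced.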
Now, let me show you the improvised enclosure I made for the transducer and the ADC board:
I took an old 22-mm metric fitting (the ADC board is exactly 22 mm in diameter), and lathed out one of the ends so that the transducer could fit tightly into it, like so:
Then I soldered the communication and power wires to the ADC board:
And then I soldered the transducer wires and inserted the board into the enclosure, securing it with a hollow insert that I lathed from a piece of scrap (if you know how a metric DIN fitting works, you'll see what I did here):
And, finally, this is the finished "apparatus" in action (the board with buttons is for the interface I am developing - I'll be disclosing the details in a separate post):
As you can see, I am using a 1 m long capillary hose to connect the Pressure Maker to the pressure transducer (the exact hose I used with an analogue gauge in the post describing the motivation behind this project). This means I can (finally) show you how the 600-bar transducer, coupled to a quality ADC, manages to detect the changes in hydrostatic pressure of the column of oil inside the capillary when its orientation is changed.
The development board gets its power from a USB port, so the 5V powering the analogue part of the ADC is actually quite noisy - the 5V rail drops to almost 4V for about 10 microseconds at a rate of 300 Hz:
With this power supply, I had to "shave" 8 bits off the 24-bit conversion result and add mild oversampling to get flicker-free readings at a rate of 2 readings per second and 16628 counts, effectively attaining a resolution of 0.04 bar. This is the hi-res mode; for the normal mode, I am discarding 9 bits and getting about 5 readings per second and 8314 counts (for the 600-bar range). All this after calibration, of course, which I did using my regular 600-bar digital pressure gauge. This actually raises an interesting question - I don't have a "proper" reference instrument to calibrate/assess this "prototype gauge"! I guess I need to come up with a DIY deadweight pressure tester for a more reliable and repeatable pressure source - a nice idea for a future project, don't you agree?
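The scaling itself boils down to one line of arithmetic. Here is a sketch of how a normal-mode conversion could look, using the figures from above (9 discarded bits, 8314 counts across the 600-bar span); the zero-offset code is a made-up placeholder for whatever the calibration step stores:

```c
#include <stdint.h>

/* Sketch of the counts-to-bar scaling described above - not the
 * actual firmware. SPAN_COUNTS is the normal-mode figure from the
 * text (8314 truncated codes across 0..600 bar); zero_code is a
 * hypothetical calibration offset captured at 0 bar. */
#define DISCARD_BITS   9        /* normal mode: drop 9 noisy LSBs   */
#define SPAN_COUNTS    8314     /* truncated codes over the range   */
#define FULLSCALE_BAR  600.0    /* transducer range                 */

static double counts_to_bar(uint32_t raw24, uint32_t zero_code)
{
    uint32_t code = raw24 >> DISCARD_BITS;        /* discard noisy bits */
    return ((double)code - (double)zero_code)     /* offset from zero   */
           * FULLSCALE_BAR / (double)SPAN_COUNTS; /* scale to bar       */
}
```

With these numbers, one truncated count works out to 600 / 8314 ≈ 0.072 bar in normal mode, and 600 / 16628 ≈ 0.036 bar in hi-res mode - which is where the 0.04 bar resolution figure comes from.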
In theory, even this resolution should be enough to detect the tiny pressure changes inside the capillary - a 0.5 m oil column produces about 0.04 bar of hydrostatic pressure - and I am happy to report that it worked exactly as expected. This is me changing the orientation of the capillary hose:
And this is how the "measuring system" responded:
First, I zeroed the gauge. Then, as I raised the capillary to about the height you see in this picture, the gauge read "0.03 bar"; when I stretched the capillary all the way up, it went to "0.07 bar"; and when I lowered the capillary end below the table, I got "-0.03 bar". Awesome!
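Those readings line up nicely with the textbook formula p = ρgh. Here's the sanity check as a tiny function - note that the oil density is my assumption (~850 kg/m³ is typical for hydraulic oil; the actual oil in the capillary may differ slightly):

```c
#include <math.h>

/* Hydrostatic pressure of an oil column: p = rho * g * h.
 * rho = 850 kg/m^3 is an assumed density for typical hydraulic
 * oil, not a measured value for the oil actually in the hose. */
static double oil_column_bar(double height_m)
{
    const double rho = 850.0;           /* kg/m^3, assumed     */
    const double g   = 9.81;            /* m/s^2               */
    return rho * g * height_m / 1.0e5;  /* Pa -> bar           */
}
```

For a 0.5 m column this gives roughly 0.04 bar, matching both the back-of-the-envelope figure above and the 0.03-0.07 bar swings the gauge actually showed.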
Also - I almost forgot - the improvised steel enclosure made a perfect EMI shield - no matter how hard I tried switching my table lamp on and off (among other things) - the ADC wouldn't register even the smallest spike!
I still have a lot of work to do with the graphic interface and the firmware, but one thing is crystal clear at this point - you absolutely can reliably detect the tiniest changes in pressure even with a hefty 600-bar transducer when you connect it to a quality analog-to-digital converter!