<html><head><style type="text/css"><!-- DIV {margin:0px;} --></style></head><body><div style="font-family:'times new roman', 'new york', times, serif;font-size:12pt;color:#000000;"><div><span class="Apple-style-span" style="font-family: arial, helvetica, sans-serif; font-size: 13px; ">Dear all,</span></div><div><span class="Apple-style-span" style="font-family: arial, helvetica, sans-serif; font-size: 13px; "><br></span></div><div><span class="Apple-style-span" style="font-family: arial, helvetica, sans-serif; font-size: 13px; ">I have a problem with the data acquisition in TestManager. </span></div><div><span class="Apple-style-span" style="font-family: arial, helvetica, sans-serif; font-size: 13px; "><br></span></div><div><span class="Apple-style-span" style="font-family: arial, helvetica, sans-serif; font-size: 13px; ">First of all, my specs:</span></div><div><span class="Apple-style-span" style="font-family: arial, helvetica, sans-serif;
etherlab 1.2-rc10</span>">
font-size: 13px; ">EtherLab 1.2-rc10</span></div><div><span class="Apple-style-span" style="font-family: arial, helvetica, sans-serif; font-size: 13px; "><span class="Apple-style-span" style="font-family: times, serif; font-size: 16px; "><div>My Linux version is openSUSE 11.2 "Emerald", kernel \r</div><div>My RTAI version is 3.7.1</div><div>MATLAB is R2009b, Simulink is v7.4, and Real-Time Workshop (RTW) is v7.4</div><div><br></div><div>Now, I have just learned through experimenting that the "fundamental sample time" in the Configuration Parameters of the Simulink model I built is what determines the maximum sampling frequency that TestManager offers in the Data Transfer option for every channel. This was quite a find for me, because nobody explains this stuff anywhere! The TestManager documentation is in German (lol!); all I have is an English translation of the TestManager 3.4 documentation, which is not good at all.</div><div>OK, now that I can finally sample at the rate I want, the next step is to choose the accuracy for each channel, right? Not quite! It turns out that whatever accuracy I enter there (3, 4, ... 9), when I acquire some channels with the Scope and then export the acquired data in MATLAB format, TestManager never respects the accuracy value. For example, if I select a 1000 Hz sampling rate plus an accuracy of 3 (three digits after the decimal point) for a channel, TestManager exports the data at 1000 Hz but with 16 digits after the decimal point. This has a serious impact: for 10 seconds of data I get files of 11-20 MB for a single channel, which is simply not workable!</div><div>May I ask: what is going on? Because I have no idea.</div><div>Thank you for your time!</div><div><br></div><div><br></div><div>Alex M.</div></span></span></div>
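P.S. To show the file-size impact of the ignored accuracy setting, here is a small Python sketch. The file names are made up, and it assumes the export is a plain ASCII table of samples (my assumption, not something TestManager documents): writing 10 seconds of one 1000 Hz channel with 16 digits after the decimal point produces a file roughly three times larger than the same data rounded to the requested 3 digits.

```python
import math
import os

# Hypothetical data: 10 s of a 1000 Hz channel (10000 samples of a 5 Hz sine),
# the kind of signal the Scope would capture.
data = [math.sin(2 * math.pi * 5 * i / 1000.0) for i in range(10000)]

# Full-precision ASCII export, roughly what TestManager emits (16 digits).
with open("channel_full.txt", "w") as f:
    f.writelines("%.16f\n" % x for x in data)

# The same samples rounded to the requested accuracy of 3 digits.
with open("channel_3dig.txt", "w") as f:
    f.writelines("%.3f\n" % x for x in data)

# Size ratio between the two exports (about 3x for this data).
print(os.path.getsize("channel_full.txt") / os.path.getsize("channel_3dig.txt"))
```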
</div><br>
</body></html>