I initially posted this on MathWorks, but realized this would be a much better place for it:
When using the advanced TDMS palette in LabVIEW, you must specify the number of values per channel used to interleave data. When not interleaving and writing only a single channel, LabVIEW's examples use the value 1 for this input. LabVIEW does not appear to track the total channel length the same way when using this advanced write function, possibly to save write time by not tracking the exact number of samples; I am not sure. The documentation instead recommends calculating the total channel length from the next segment offset and the raw data offset in the file's lead-in. Because convertTDMS relies on the stored per-channel count instead, only the first data value of such files is converted. I have found a patch for my use case.
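For context, here is a minimal sketch of the calculation the documentation recommends: the TDMS segment lead-in is 28 bytes ("TDSm" tag, ToC bitmask, version, then two little-endian uint64 offsets at bytes 12 and 20), and the raw-data byte count is the difference of those offsets. The file name and the 8-byte sample size (a double channel) are assumptions for illustration:

```matlab
% Sketch: derive the segment's raw-data length from the TDMS lead-in,
% rather than trusting the per-channel value count in the metadata.
fid = fopen('mydata.tdms', 'r');              % hypothetical file
fseek(fid, 12, 'bof');                        % skip tag, ToC mask, version
nextSegOffset = fread(fid, 1, 'uint64', 'l'); % bytes 12-19, little-endian
rawDataOffset = fread(fid, 1, 'uint64', 'l'); % bytes 20-27, little-endian
fclose(fid);

% Raw data bytes in this segment = gap between the two offsets
dataLengthBytes = nextSegOffset - rawDataOffset;

% Samples per channel = bytes / bytes-per-sample (8 for a double channel,
% assuming a single non-interleaved channel in the segment)
nValues = dataLengthBytes / 8;
```

This is the same quantity the script already computes and stores in SegInfo.DataLength, which is why the patch below can reuse it.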
In this MATLAB script the data length is calculated correctly and saved in SegInfo.DataLength, but instead of using that value to determine the number of samples to convert, convertTDMS line 789 reads the samples-per-channel value written into the channel's metadata, which is not correct in this instance. Replacing line 789, "index.(obname).nValues(ccnt)=fread(fid,1,'uint64',kTocEndian);", with "index.(obname).nValues(ccnt)=SegInfo.DataLength/index.(obname).datasize;" fixes this when there is only a single segment with no metadata changes between subsequent writes to the file. I have not tested how this behaves with multiple channels, interleaving, or multiple distinct segments within a channel, but I expect it should work for all of those cases. You will also need to account for no longer reading that value by seeking ahead 8 bytes in the file.
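Putting the two pieces of the patch together, a sketch of the change around line 789 might look like this (untested beyond the single-segment, single-channel case described above):

```matlab
% Original line 789 reads the stored per-channel count, which LabVIEW's
% advanced write leaves at 1 in this scenario:
% index.(obname).nValues(ccnt)=fread(fid,1,'uint64',kTocEndian);

% Replacement: derive the count from the segment's raw-data length
% (bytes) divided by the channel's per-sample size (bytes):
index.(obname).nValues(ccnt)=SegInfo.DataLength/index.(obname).datasize;

% Since the fread above no longer consumes the stored uint64, skip
% past those 8 bytes so later reads stay aligned:
fseek(fid,8,'cof');
```

The fseek call is my suggested way of "scanning ahead 8 bytes"; any equivalent adjustment to the file position would do.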