Problem with calculated timestamps

Before you point out how the code is wrong, bear in mind that it was written to make the unit tests pass, and those tests enforce that a conversion from frame count to timecode and back to frame count (see the other code in the same file) gives back the same count for every 32-bit signed int. Any suggestion on your part must be in the form of a pull request that passes the unit tests included in the repo.
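For context, here is a minimal sketch of the round-trip property those tests enforce. Everything in it is illustrative: the 25 fps non-drop-frame rate, the `timecode_t` struct, and the names `frames_to_timecode` and `timecode_to_frames` are assumptions for the sketch, not the repo's actual API. The point is only that truncating division decomposes a signed frame count exactly, so the reconstruction inverts it for every 32-bit value, negatives included.

```c
/* Sketch of the round-trip invariant the unit tests enforce.
 * Assumptions (not from the repo): 25 fps, non-drop-frame, and
 * placeholder names frames_to_timecode / timecode_to_frames. */
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

#define FPS 25  /* assumed frame rate; the real code may differ */

typedef struct { int32_t hh, mm, ss, ff; } timecode_t;

/* Decompose a frame count into hh:mm:ss:ff with C's truncating
 * division, so negative counts decompose consistently too. */
static timecode_t frames_to_timecode(int32_t frames)
{
    timecode_t tc;
    tc.ff = frames % FPS;
    int32_t s = frames / FPS;   /* whole seconds */
    tc.ss = s % 60;
    int32_t m = s / 60;         /* whole minutes */
    tc.mm = m % 60;
    tc.hh = m / 60;
    return tc;
}

/* Exact inverse of the decomposition above. */
static int32_t timecode_to_frames(timecode_t tc)
{
    return ((tc.hh * 60 + tc.mm) * 60 + tc.ss) * FPS + tc.ff;
}

int main(void)
{
    /* Exhaustive round-trip check over every 32-bit signed int;
     * slow but exactly the guarantee the tests demand. */
    for (int64_t f = INT32_MIN; f <= INT32_MAX; f++) {
        timecode_t tc = frames_to_timecode((int32_t)f);
        assert(timecode_to_frames(tc) == (int32_t)f);
    }
    puts("round trip holds for all 32-bit signed frame counts");
    return 0;
}
```

With that invariant in mind, here is the code: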