If the jitter was there I would notice it.
When displaying the 60 Hz source at 100 Hz, there will be issues; they're not random, though, so you might easily get used to them in their raw form: every third source frame is displayed only once, whilst all the other frames are displayed twice.
Given that it's not random, jitter is probably the wrong word; it's really temporal aliasing, like the so-called "wagon-wheel effect". Temporal aliasing between two asynchronous clocks can give the impression of jitter - i.e. uneven frame timing (two display refreshes per source frame for two frames, then one refresh for the next, then the pattern repeats). In the case of digital video I think it's called "judder".
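To make that cadence concrete, here's a minimal sketch - my own illustration, not a description of any particular TV's pipeline - assuming the panel simply holds the most recently completed source frame at each refresh and the two clocks are perfectly locked:

    # Sketch only: the display shows the latest completed source frame at
    # each refresh; rates and frame counts are just illustrative inputs.
    def repeat_counts(source_hz, display_hz, frames):
        """How many refreshes each of the first `frames` source frames stays on screen."""
        counts = []
        for n in range(frames):
            # Refreshes k satisfying n/source_hz <= k/display_hz < (n+1)/source_hz
            first = -(-n * display_hz // source_hz)        # ceil(n * display / source)
            last = -(-(n + 1) * display_hz // source_hz)   # ceil((n+1) * display / source)
            counts.append(last - first)
        return counts

    print(repeat_counts(60, 100, 6))   # [2, 2, 1, 2, 2, 1] -> the two-two-one cadence

The repeating [2, 2, 1] groups are exactly the "two frames twice, one frame once" pattern described above.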
Now, many TVs can interpolate (by some method or other) the source up to the panel's higher refresh rate, but most (if not all) of these schemes introduce visual artefacts and significant latency. You may be using Samsung's equivalent of this interpolation, which is usually designed only for film-based sources - hence the inclusion of a "game" mode on some sets. 120 Hz or 240 Hz displays don't have judder issues with 60 Hz sources, because the display rate is an integer multiple of the source rate and every frame is repeated the same number of times; interpolation will still introduce (greater) latency, though. Similarly, a 50 Hz source displayed at 100 Hz is just fine, too. Thinking about that a bit more, it might be that the game is only rendering at 50 Hz, but I don't know whether it does that over HDMI, given that most sets with HDMI input can display at 60 Hz. (We can see now that accurate information about the exact display process is key.)
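Same back-of-the-envelope check as before (again just a sketch, assuming a simple frame-hold with no interpolation), showing why the integer-multiple cases behave themselves:

    # Sketch, same frame-hold assumption as above: if the display rate is an
    # integer multiple of the source rate, every frame is held for the same
    # number of refreshes, so there's no uneven cadence to perceive as judder.
    def describe(source_hz, display_hz):
        if display_hz % source_hz == 0:
            return f"each frame held {display_hz // source_hz}x (even, no judder)"
        return "uneven hold counts (judder)"

    for src, disp in [(60, 100), (60, 120), (60, 240), (50, 100)]:
        print(f"{src} Hz on a {disp} Hz panel: {describe(src, disp)}")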
With good interpolation, the judder shouldn't be evident, but it's not a viable solution for many people because of the latency.
The real issue is the tearing, of course. I have no idea what interpolation does to screen tearing; I imagine it could diminish the starkness of tear lines somewhat, depending on the exact method used.