Re: Random TV question: 'juddering' in fast pans
The answer is that cinema features are filmed at twenty-four frames per second. That's enough to sustain the illusion of movement, but not enough for smooth panning. Cinematographers and directors go to great lengths to distract the viewer from this judder when tracking their subjects.
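To put rough numbers on it (these are assumed figures for illustration, not anything measured from a real broadcast): a pan that crosses the whole frame in a few seconds moves the image a surprisingly large distance between consecutive frames at 24fps:

```python
# Rough arithmetic with assumed numbers: how far the image jumps
# between consecutive frames during a full-width pan.
WIDTH_PX = 1920     # assumed horizontal resolution
PAN_SECONDS = 4.0   # assumed time to pan across the whole frame

for fps in (24, 50, 60):
    jump_px = WIDTH_PX / (PAN_SECONDS * fps)
    print(f"{fps:2d} fps: {jump_px:.1f} px jump per frame")
```

At 24fps the image lurches 20 pixels between frames; at 50 or 60fps the steps are less than half that, which is why video pans look so much smoother.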
The reason you're seeing it on the F1 Now feeds is different: it could be a display-rate mismatch between the source stream (50fps if it's from a British broadcaster) and your tablet's display (60Hz, as most LCD panels are). You get a kind of "6:5 pulldown" - six displayed frames fed by five frames of input material, with one frame doubled. The streaming server could also be dropping alternate frames to save bandwidth, reducing the time resolution to 25 images per second.
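The 6:5 pulldown can be sketched as a frame-index mapping (a toy model, not how any particular player implements it): each 60Hz refresh shows the most recent 50fps source frame that has arrived, so one frame in every five is displayed twice.

```python
# Toy model of the "6:5 pulldown": map each 60Hz display refresh to
# the latest 50fps source frame available at that instant.
def pulldown(num_refreshes, src_fps=50, disp_fps=60):
    # Refresh i happens at time i/disp_fps; the source frame on screen
    # is floor(i * src_fps / disp_fps). Integer maths avoids float drift.
    return [(i * src_fps) // disp_fps for i in range(num_refreshes)]

print(pulldown(12))  # frames 0 and 5 each appear twice
```

Every sixth displayed frame is a repeat, and that periodic doubling is exactly the stutter you notice on fast pans.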
The rest of this is long and a little rambly, so you can stop reading here.
This low frame rate is a legacy of the technology available in the 1930s. At that time, the mechanisms of the day could not pull a new frame into the gate any faster without tearing or slipping. (24 frames a second doesn't sound fast, but remember that each new frame has to be pulled into position in the tiny fraction of a second that the projector's shutter is closed, so the mechanism needs to be much quicker than the frame rate suggests.)
Raising the frame rate of cinema presentations has been technically possible since the early 1970s, but experiments with the viewing public showed that audiences do not like the effect of high-frame-rate cinema: it looks less "real" than the slower rate, and the reason has nothing to do with technology:
Traditionally, lower-budget TV drama was either broadcast live or captured on videotape, because video is far, far cheaper than film production (not least because videotape is reusable and takes can be reviewed instantly). Video did have one advantage over film, though: a time resolution of 50 or 60 fields per second, which makes motion, and especially panning, much smoother. Higher-budget TV shows were still shot on 16mm or 35mm film stock at 24fps, because film allowed both exterior and interior shooting (cheap drama was studio-bound; early video cameras were too) and it sidestepped the problem of selling your programming to a station with an incompatible video system. A desirable side effect was that these telecine presentations looked like "a real film" rather than "a TV show".
But, thanks to the historic use of video for cheap TV drama, a high frame rate is now almost indelibly associated in the audience's mind with low-budget videotaped shows or live events. Simply displaying a film feature at a high frame rate suggests the same low-budget, "unrealistic" experience to cinema viewers, or makes it feel like a live broadcast - by making it look like the actors are "right there", it also makes them look like "just actors", destroying some of the suspension of disbelief.
The recent release of Peter Jackson's "The Hobbit" was offered at both 48 and 24fps, but audiences reported that the 24fps showing felt "grander" and more "epic" than the 48fps one, which by contrast was described as "like watching TV" and "fake". For the sequel, the 48fps presentation was either dropped or taken up by very few cinema owners.
If you've got a TV with motion interpolation, as most modern LCDs have, watch a Blu-ray of a big blockbuster film first at its native 24fps, and then with all of the motion gubbins turned on. The latter might look more "real", but it also looks less "cinematic".
The same effect from the opposite side: pretty much all TV productions are now shot digitally and broadcast at 50/60fps, but big-budget drama is either shot natively at 25 or 30fps, or is de-interlaced down to a 25/24 frame rate in post-production, to achieve a "filmic" look.
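A minimal sketch of that de-interlacing step (not any broadcaster's actual pipeline - real de-interlacers do motion-adaptive filtering rather than a naive weave): two interlaced fields captured 1/50s apart are woven into one progressive frame, halving the temporal resolution from 50 to 25 images per second.

```python
# Naive "weave" de-interlace (illustration only): pair up 50Hz fields
# into 25fps frames, halving the temporal resolution.
def weave(fields):
    """fields: alternating even-line/odd-line fields, captured 1/50s
    apart. Each field is a list of scan lines."""
    frames = []
    for i in range(0, len(fields) - 1, 2):
        even, odd = fields[i], fields[i + 1]
        # Interleave: even field fills lines 0,2,4...; odd fills 1,3,5...
        frames.append([line for pair in zip(even, odd) for line in pair])
    return frames

fields = [["e0", "e2"], ["o1", "o3"]] * 2   # four fields -> two frames
print(len(weave(fields)))  # 2: half as many frames as fields
```

Because each output frame combines two moments in time, motion now updates only 25 times a second - which is exactly the slightly juddery, "filmic" cadence described above.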
Like it or not, 24 frames per second is considered a sign of "quality" by the viewing public.