After I cut the cable, I discovered that many shows I wanted to watch were 
readily downloadable on the internet.  Sometimes I had to wait a week 
(more likely 8 days) to get them, but I didn't have to pay for any of it. 
The video files were either full HD (1080p) or 720p (which also looks 
great) and the viewing experience was better than with the DVR -- I have a 
Linux PC connected to one of the HDMI ports on my TV and I use vlc to 
watch the videos.  Best of all, there are no ads in any of these videos, 
so nothing to "zap".

I use the program youtube-dl to download the videos.  It uses ffmpeg.  I 
installed it from the Ubuntu repository, which gave me the very helpful 
man page, but there is a problem with that:  youtube-dl needs frequent 
updates, which are very fast and easy to do, but the updates don't work 
with the version from the repository.  To fix that, after doing the 
repository-based installation, I used wget to download the latest 
version, following these instructions:

https://rg3.github.io/youtube-dl/download.html

Then I put that copy in a directory early in my path so that it is called 
instead of the repository's version.  After that, the -U option updates 
it successfully.  Before that, I got this error:

$ youtube-dl -U
Usage: youtube-dl [OPTIONS] URL [URL...]

youtube-dl: error: youtube-dl's self-update mechanism is disabled on Debian.
Please update youtube-dl using apt(8).
See https://packages.debian.org/sid/youtube-dl for the latest packaged version.
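Here's a quick demo of how the path trick works, using a dummy 
youtube-dl in a scratch directory (for real use, you'd wget the actual 
program per the instructions above into a directory like ~/bin that 
comes early in your PATH -- ~/bin is just an example):

```shell
# Make a scratch directory with a stand-in youtube-dl in it:
demo=$(mktemp -d)
printf '#!/bin/sh\necho local copy\n' > "$demo/youtube-dl"
chmod +x "$demo/youtube-dl"

# Put that directory at the front of PATH; the shell now finds
# this copy before the one from the repository:
PATH="$demo:$PATH"
command -v youtube-dl    # resolves to the $demo copy
youtube-dl               # prints: local copy
```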

Then I use this one-line bash script (I call it yttv.bash) when I 
download TV shows:

#!/bin/bash
youtube-dl -ci -o "%(uploader)s - %(upload_date)s - %(title)s [%(resolution)s].%(ext)s" --write-auto-sub --write-sub --sub-lang en,en-us --sub-format srt "$@"

I usually rename the files and move them to their own directories.  This 
is a web page I made to keep track of my downloads:

http://selectionism.org/mbmiller/tv/

I use a cool trick to maintain the "Latest File" info in the web page -- 
that date isn't text, it's an image.  The image is updated by a script 
that uses ImageMagick "convert".  Here's that script:

--------------start script on next line-----------------
#!/bin/bash

DIR=/media/mbmiller/where/my/tv/network/directories/are

for PROGRAM in Independent_Lens POV Finding_Your_Roots Frontline Nature NOVA Secrets_of_the_Dead American_Experience American_Masters Austin_City_Limits Supergirl Blackish Goldbergs Mom Crazy_Ex-Girlfriend Once_Upon_a_Time Saturday_Night_Live 60_Minutes Simpsons Last_Man_on_Earth American_Idol American_Housewife Roseanne Colbert ; do
    ls -1 ${DIR}/*/${PROGRAM}/*20[0-9][0-9]-[0-1][0-9]-[0-3][0-9]*.mp4 | tail -1 | \
        perl -pe 's/^.* (20\d\d-[0-1]\d-[0-3]\d) .*$/$1/' | \
        convert -size 96 -font Verdana label:@- /home/mbmiller/www/tv/images/${PROGRAM}.gif
done
-------------end script on previous line----------------

It's a pretty simple scheme: $DIR contains subdirectories that are network 
names like PBS, ABC, CBS, NBC, Fox, etc.  Those subdirectories contain 
subdirectories that are (unique) program names like Finding_Your_Roots, 
Blackish or Crazy_Ex-Girlfriend.  The TV program video files always have 
the air date in them and it's always in the yyyy-mm-dd format.

Example:

$ cd /media/mbmiller/where/my/tv/network/directories/are
$ ls -1 ABC/Blackish/*20[0-9][0-9]-[0-1][0-9]-[0-3][0-9]*.mp4 | tail -1
ABC/Blackish/Blackish - S04 E23 - 2018-05-15 - Netflix & Pill [1280x720].mp4

For this to work, the filenames when sorted by ls have to be in 
chronological order.
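Here's a quick sanity check that the yyyy-mm-dd format sorts 
chronologically under ls's default lexicographic sort (scratch 
directory, made-up episode names):

```shell
# Create three fake episodes out of order and ask for the latest:
demo=$(mktemp -d)
touch "$demo/Show - 2018-05-15 - Newest.mp4" \
      "$demo/Show - 2017-12-31 - Old.mp4" \
      "$demo/Show - 2018-01-05 - Newer.mp4"

# Lexicographic order on yyyy-mm-dd is chronological order,
# so the last line is the most recent air date:
ls -1 "$demo" | tail -1
# prints: Show - 2018-05-15 - Newest.mp4
```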

FWIW, that's my scheme.  One last issue -- you can't wait too long to 
download the videos or they won't be freely available anymore.  They 
seem to stay available for several weeks, so there's no big rush.  I 
usually check once per day to see what is available and download it. 
You can run one youtube-dl (or yttv.bash) command with many video URLs 
listed as arguments.  Renaming the files is the bigger job, but the 
"rename" command is pretty helpful for that.  I also convert .vtt and 
.ttml subtitle files to .srt.
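As a toy example of the renaming step, here's a plain-shell version of 
one batch rename I do -- stripping the ".en" language tag that 
youtube-dl puts in subtitle filenames (a rename(1) one-liner does the 
same thing in one line; this demo runs in a scratch directory):

```shell
# Make a scratch directory with a sample downloaded subtitle file:
demo=$(mktemp -d)
touch "$demo/Show - 2018-05-15.en.srt"

# Rename *.en.srt to *.srt using parameter expansion:
for f in "$demo"/*.en.srt; do
    mv -- "$f" "${f%.en.srt}.srt"
done

ls "$demo"
# prints: Show - 2018-05-15.srt
```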

Best,
Mike