My understanding is that "liberal" harkens back to an age when both Republicans and Democrats thought of themselves as liberal, or even further back to what we call classical liberalism. But it really wasn't a partisan political label until FDR hijacked it. The Progressive Era of the late 1800s and early 1900s was falling into disrepute, and FDR needed to attach himself to another label when in fact he was a progressive. Still, I think you find some Democrats calling themselves liberal into the early 60s, when they were still proud to be Americans. Many whom we call liberal today have returned to calling themselves progressives, who are now fully Woke.