What is the history of football in America?

American football was born on the college campuses of 19th-century America, blending rugby's physicality with soccer's ball-handling finesse. The first official game was played in 1869 between Rutgers and Princeton, marking the sport's start. From there the game spread rapidly from campus to campus, and in 1920 the league that would become the NFL was founded, cementing football's status as a major American pastime. Today, football stands as one of America's most beloved sports, drawing millions of fans to a thrilling spectacle of athleticism and teamwork.