When Was the First Credit Card Invented?

At Experian, one of our priorities is consumer credit and finance education. This post may contain links and references to one or more of our partners, but we provide an objective view to help you make the best decisions. For more information, see our Editorial Policy.

For better or worse, credit cards are a cornerstone of the American economy. At the end of 2017, the average American held 3.1 credit cards with an average balance of $6,354—plus 2.5 retail credit cards with an additional balance of $1,841, according to Experian's State of Credit report. But when were credit cards invented?

"Paying with plastic" is so commonplace that total U.S. credit card debt topped $1 trillion last year, according to the Federal Reserve.

But have you ever stopped to think about how we got to this place? Perhaps the most amazing thing about credit cards is how relatively quickly they've become essential to modern capitalism.

Most historians trace the modern credit card to the founding of Diners Club in 1950, the first charge card that could be used to make purchases at multiple retailers. Diners Club was a new twist on an ancient practice.

Here is a brief history of the credit card.

Early Forms of Credit

For thousands of years, merchants have used credit to help their customers finance purchases. For example, seeds could be sold to farmers on terms that permitted payment after the harvest.

Some of the earliest written examples of a credit system include the Code of Hammurabi, named after the ruler of Babylon from 1792 to 1750 B.C., in what is now Iraq. These laws established rules for loaning and paying back money, and how interest could be charged.

Historically, a loan was a financial agreement between a single borrower and a single creditor or merchant. In more modern times, a customer might be able to "run a tab" with an individual merchant, which is a revolving line of credit that can be continuously borrowed against and has no fixed payoff date. This is the equivalent of a store credit card that's not part of a larger payment network.

<iframe src="https://e.infogram.com/f5890759-0a30-4648-8d27-dd5024a5fd80?src=embed" title="Credit Cards Over History" width="550" height="925" scrolling="no" frameborder="0" style="border:none;" allowfullscreen="allowfullscreen"></iframe>

In the late 19th century and early 20th century, companies built on the idea of revolving credit to include a physical object that could be used to easily identify their customer accounts. Some were in the form of coins or medals that included the name and logo of the merchant, as well as the customer's account number.

Just like many credit card transactions in the late 20th century, the merchant would make an imprint of the coin or medal on the customer's sales slip. In the 1930s, these coins and medals evolved into rectangular metal cards called Charga-Plates that looked like something between a credit card and military dog tag.

Watch our Great Moments in Credit History video series on YouTube for more.

The Final Countdown

With consumers carrying around rectangular metal cards that they could use to make purchases, there were just a few things missing before someone could create the modern payment card:

First, someone had to conceive of a financial instrument that could be used to make charges at multiple merchants. An early example was the Air Travel Card, which allowed travelers in the 1940s and '50s to purchase tickets on credit from multiple airlines.

The modern payment card was created in 1950 by Ralph Schneider and Frank McNamara, who founded Diners Club. This was the first general purpose charge card, but it required consumers to pay each month's statement balance in full.

Later, American Express and others would offer customers the option to carry a balance on their cards. This was the final innovation required to create the financial product that we would recognize as a modern credit card.

The Evolution of Credit Card Technology

At first, credit cards worked like the earlier medals, coins, and plates. Merchants would simply take an imprint of the card, a process familiar to anyone who remembers how many credit card purchases were made up until the 1990s. But by the 1980s, many cards began to carry a magnetic stripe on the back, which could be read by specialized computer equipment that was state-of-the-art at the time.

By today's standards, a magnetic stripe is considered primitive, as the information stored on it isn't even encrypted. Just as imprinting gave way to magnetic stripe readers, credit cards with embedded computer chips are now making magnetic stripes obsolete. These embedded computer chips, called EMV smart chips, allow for encrypted, two-way authentication between a merchant's credit card terminal and the payment processing network.
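To see why the magnetic stripe counts as primitive, it helps to know that the account data on it is stored as plain, human-readable text in a fixed layout (defined by ISO/IEC 7813). The sketch below parses a fabricated Track 1 string to show that anyone with a stripe reader sees the card number directly; the sample data and field names here are illustrative, not real account details.

```python
# A magnetic stripe stores account data as plain, unencrypted text.
# This sketch splits a sample ISO/IEC 7813 Track 1 string (format code "B")
# into its fields. The sample below uses a standard test card number.

def parse_track1(track: str) -> dict:
    """Split a format-B Track 1 string into its named fields."""
    if not (track.startswith("%B") and track.endswith("?")):
        raise ValueError("not a format-B Track 1 string")
    # Fields between the start sentinel (%B) and end sentinel (?)
    # are separated by "^" characters.
    pan, name, rest = track[2:-1].split("^")
    return {
        "card_number": pan,      # primary account number, in the clear
        "cardholder": name.strip(),
        "expiry": rest[:4],      # YYMM
    }

sample = "%B4111111111111111^DOE/JOHN^24051010000000000000?"
print(parse_track1(sample))
# The card number and expiry date come out in plain text, with no
# decryption step required -- which is exactly the weakness EMV chips address.
```

An EMV chip, by contrast, never exposes static account data this way: each transaction involves an encrypted challenge-response with the terminal, so a copied stripe-style dump is useless to a counterfeiter.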

This technology dates back to the 1990s, and it was widely adopted in Europe over the last 20 years. However, it's only been in the last five years that America has undergone its migration to EMV-equipped cards and readers. The encrypted communications make card transactions far less vulnerable to hackers, while the computer chips are much more difficult for criminals to counterfeit than simple magnetic stripes.

However, some industry experts suggest that the era of EMV smart chips may be relatively short, as wireless payment technologies are rapidly being integrated into smartphones, watches and other wearable platforms. Finally, many foresee a day when biometric authentication allows consumers to charge purchases using a fingerprint or retinal scan, without carrying any object that contains their account information.

We've come a long way from the days of using metal coins to make charges, and the cards in your wallet may also be obsolete in the near future.

In 1949, a New York businessman named Frank McNamara went to pay his restaurant bill after entertaining clients, only to find he had left his wallet in another suit. From this potentially embarrassing situation (his wife was able to cover the bill), he came up with the idea for Diners Club, which issued its first cards to approximately 200 customers in the USA, who could use them in selected restaurants in New York.

However, this was technically a charge card, as the customer had to pay the entire balance as soon as Diners Club billed it. In 1959, Bank of America began issuing its own card in California, the first widely available credit card accepted by a substantial merchant base. Around the same time, American Express issued its first cards for travel and entertainment charges, accepted at participating restaurants, hotels and airlines.

In 1966, Bank of America began forming licensing agreements with other banks, enabling them to issue credit cards on a widespread basis. The card was promoted, particularly to traveling salesmen, as a time-saving device rather than a form of credit, since instant access to their own banking facilities was difficult when they were away from their home area.

In the same year, 14 US banks formed the first banking association, named Interlink, with the ability to exchange information on credit card transactions. The following year, four Californian banks introduced the Master Charge programme in order to compete with the BankAmericard, which was renamed Visa in 1976; Master Charge itself was renamed MasterCard in 1979. Thereafter, banks wishing to issue credit cards joined either the Visa or MasterCard association, whose members shared costs, making access to the programmes available to smaller financial institutions.

Barclays Bank was the first bank outside the USA to issue its own credit card, Barclaycard, in 1966, but it wasn't until the magnetic strip was introduced in 1970 that the credit card entered the technological era. Under this system, invented by IBM, a band of magnetic material on the reverse of the card stores account data and is swiped through a magnetic reader. The system proved reasonably effective, cutting down on both paperwork and fraud, but it still had a number of security problems.

To solve this, banks have been replacing that system with the current technology of 'smartcards' or 'Chip & PIN' cards, which contain an embedded microchip and can only be authenticated using a Personal Identification Number (PIN). In the UK it is common for merchants to refer to their card-reading terminals as PDQ machines. Barclays Merchant Services (BMS), one of the first to introduce credit card machines in the UK, called its terminals PDQs, and people came to associate the initials with credit card machines, just as vacuum cleaners came to be known as Hoovers after the company that pioneered them. Cardholders can now also pay for goods using contactless technology, with either their payment card or smartphone.

Business owners must apply to a bank for the facility to accept credit and debit cards as a form of payment, whether via a credit card machine or over the internet. Most banks now offer mobile and portable terminals, commonly used by restaurateurs, alongside the most common static countertop models.

While credit cards reached very high usage levels in the USA, UK and Canada, take-up was much slower in many other developed countries, with many European countries not approaching the penetration levels of the USA and UK until the mid to late 1990s. Today, however, there are countless variations on the basic concept of revolving credit, including branded credit cards, store cards and commercial cards, and with continually developing technology we can only wait and see how the market will evolve in the coming years.