A three-digit number determines where you live, what you drive, whether you get hired, and how much you pay for insurance. It was supposed to make lending fair. It made something else entirely.

## The Invention of Credit Scoring

In 1956, engineer Bill Fair and mathematician Earl Isaac, colleagues at the Stanford Research Institute, founded Fair, Isaac and Company in San Francisco. Their idea was simple: replace the subjective judgment of bank loan officers with a statistical model. Loan officers had enormous discretion, and that discretion was routinely exercised against women, minorities, and anyone who did not fit the profile of a "reliable" borrower.

The first FICO credit bureau risk score was introduced in 1989. By the mid-1990s, it had become the industry standard. Today, FICO scores are used in over 90% of U.S. lending decisions.

The pitch was objectivity. The reality was something more complicated.

## The Bias Built Into the Algorithm

Credit scores do not measure financial responsibility. They measure financial behavior within a system that was already unequal before the score existed. The factors that determine a FICO score (payment history, credit utilization, length of credit history, credit mix, and new credit inquiries) are all shaped by wealth, income, and access. People who were systematically denied credit for decades have shorter credit histories. People who were redlined into neighborhoods with fewer banking options rely more on alternative financial services that do not report to credit bureaus.

A 1997 internal FICO analysis, confirmed by the company, found that consumers living in minority neighborhoods had lower overall credit scores. A 2024 National Consumer Law Center issue brief titled "Past Imperfect" documented how credit scores "bake in and perpetuate past discrimination" by encoding historical disadvantages into a number that pretends to be race-neutral. The average credit score for Black Americans is approximately 65 points lower than for white Americans.
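The factor mix described above can be illustrated with a toy weighted model. The category weights (35/30/15/10/10) are FICO's published breakdown; everything else here is hypothetical — FICO's actual scorecards are proprietary, and the way each factor is turned into a 0–1 subscore below is invented purely for illustration:

```python
# Toy illustration of a weighted factor model. This is NOT FICO's formula;
# only the category weights below are FICO's published breakdown. The
# subscores are hypothetical inputs in [0, 1].

FICO_RANGE = (300, 850)

WEIGHTS = {
    "payment_history": 0.35,     # 35% — payment history
    "credit_utilization": 0.30,  # 30% — amounts owed / utilization
    "history_length": 0.15,      # 15% — length of credit history
    "credit_mix": 0.10,          # 10% — credit mix
    "new_credit": 0.10,          # 10% — new credit inquiries
}

def toy_score(subscores: dict) -> int:
    """Map factor subscores in [0, 1] onto the 300-850 scale."""
    lo, hi = FICO_RANGE
    weighted = sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)
    return round(lo + weighted * (hi - lo))

# Two borrowers with identical (perfect) payment behavior: the one whose
# access to credit started later scores lower -- the structural point
# the section above is making.
long_history = {"payment_history": 1.0, "credit_utilization": 0.9,
                "history_length": 0.9, "credit_mix": 0.8, "new_credit": 0.9}
short_history = dict(long_history, history_length=0.2, credit_mix=0.4)

print(toy_score(long_history))
print(toy_score(short_history))
```

Even in this crude sketch, a quarter of the weighting (history length plus credit mix) rewards longstanding access to credit rather than repayment behavior, which is how historical exclusion carries forward into the number.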
The score does not ask why. It simply reports the disparity as if it were a measure of individual merit.

## Errors That Destroy Lives

The Federal Trade Commission conducted a landmark study on credit report accuracy, published in 2013. The findings:

- 26% of consumers had at least one potentially material error on at least one of their three credit reports
- 5% of consumers had errors serious enough to result in less favorable terms for loans, insurance, or employment
- Of consumers who disputed errors, 69% still had inaccuracies on their reports after the dispute process

These are not rounding errors. They are people denied mortgages, charged higher interest rates, or rejected for jobs because a data entry mistake at one of three credit bureaus went uncorrected for years.

The dispute process is notoriously opaque. Consumers must navigate a bureaucratic maze while the burden of proof falls on the individual, not the bureau.

## Credit Scores Beyond Lending

The original purpose of credit scores was to assess lending risk. That purpose expanded quietly and without public debate:

- **Employment:** A 2012 survey by the Society for Human Resource Management found that 47% of employers checked credit reports for some or all job candidates. The practice is most common for positions involving financial responsibility, but it is also used for entry-level jobs where financial access is minimal.
- **Housing:** Landlords routinely screen tenants using credit scores. In competitive rental markets, a low score can effectively exclude someone from housing entirely.
- **Insurance:** Most auto and homeowners insurance companies use credit-based insurance scores to set premiums. The Federal Trade Commission itself confirmed in a 2007 report that these scores are predictive of insurance claims, while also acknowledging that they disproportionately affect minority consumers.
- **Utilities:** Electric, gas, and water companies may require larger deposits from customers with low credit scores.

A number designed to predict whether you will repay a loan now determines whether you can turn on your lights.

## China's Social Credit System: The Logical Extreme

China's Social Credit System is frequently cited as a dystopian alternative. The reality is more complex than Western media portrayals, but the trajectory is instructive.

China's system, announced in 2014 and implemented in stages, aggregates data from financial records, court judgments, and government databases to assign trustworthiness ratings. Consequences for low scores can include restrictions on travel (banning individuals from flights and high-speed trains), reduced access to loans, and public blacklisting. A 2019 "national model" social credit system in a city of 1 million people used 389 rules: 124 to reward "good" behavior and 265 to punish "bad" behavior.

Western observers call this surveillance. But the fundamental mechanism, reducing complex human behavior to a score that determines access to basic services, already exists in the United States. The difference is one of degree, not kind.

## The Alternatives We Do Not Discuss

Several alternatives to traditional credit scoring exist but receive minimal attention:

- **Manual underwriting:** Some lenders, including a few mortgage providers, still evaluate borrowers based on actual financial documents rather than algorithmic scores
- **Rental payment reporting:** Programs like Experian Boost and eCredable allow on-time rent and utility payments to be factored into scores, but adoption is limited
- **Alternative data models:** Some fintech companies are experimenting with scoring models that use banking transaction data instead of traditional credit bureau data
- **Public credit registries:** Several countries operate government-run credit reporting systems with more transparency and consumer protections than the U.S. model

None of these address the fundamental problem: a single number should not be the gateway to participation in society.

## What It Actually Controls

Credit scores were sold as a tool for lenders. They function as a mechanism for sorting people into categories of access and exclusion, categories that reflect existing inequalities while claiming to be objective.

The score does not care why your credit is poor. It does not distinguish between medical debt and reckless spending. It does not account for systemic discrimination. It simply assigns a number and moves on, while that number follows you for years.

They did not ask whether we wanted this. A fair system would look different. The number speaks for itself.

_- The Department_