From Catalogs to Clicks: The Fair Lending Implications of Targeted, Internet Marketing

Consumer Compliance Outlook: Third Issue 2019

By Carol A. Evans, Associate Director, and Westra Miller, Counsel, Division of Consumer and Community Affairs, Federal Reserve Board

When introduced in the late 1880s, the Sears catalog became a powerful tool for African Americans, suffering under Jim Crow and other forms of discrimination and segregation, to have the same shopping experience as whites.1 During this time, African Americans routinely faced discrimination in retail stores, such as higher prices and a limited selection of goods. The social disruption created by the Sears catalog prompted some white storeowners to encourage their customers to burn the catalog in the streets in protest.2 Recognizing the challenges that its African American customers faced, Sears included instructions on how to place an order through the post office and provided other ways for rural African Americans, non-English speakers, and others who had been systemically excluded from American civil society to order from the catalogs.3 The anonymity of the catalog offered shoppers of all backgrounds a level of retail inclusion that would take decades to achieve in physical stores. Moreover, the catalog bore other benefits: Sears offered credit that allowed African American farmers to buy the same items as their white peers, without the markup imposed when buying on credit at a local general store. The catalog’s prices were also lower than those offered in the rural towns or countryside where many African Americans lived.4

These benefits of the Sears catalog provide important lessons about financial inclusion. The anonymity offered by ordering from a catalog leveled the playing field for African Americans and other disadvantaged groups. By ensuring that everyone had access to the same products, Sears played a role in opening the marketplace for marginalized consumers. Ostensibly, the Internet could play this same role for modern consumers. However, today this broad-based approach appears to have been largely eclipsed by targeted marketing strategies designed to reach specific categories of consumers and to undermine consumers’ anonymity.

Online advertising platforms, such as those offered by Facebook, allow companies to use vast amounts of consumer data to target marketing in a highly individualized manner by using sophisticated algorithms that will only display advertisements to audiences or Internet users with desired characteristics. Although the anonymity of a catalog may have been an antidote to discrimination in face-to-face shopping encounters, today’s Internet leaves consumers more — not less — identifiable as companies become more efficient at targeting certain demographics.

The results of this targeted marketing may be discriminatory in contexts in which consumer protection and civil rights laws apply, such as marketing credit. While the use of technology in consumer financial services, or fintech, has created many innovations that benefit consumers, the ability to filter the reach of marketing so narrowly can raise a range of consumer protection and financial inclusion concerns, including the fair lending risks of steering and redlining. This article focuses on the increased use of Internet-based marketing practices to target audiences by personal characteristics, geography, or even hobbies. This practice may explicitly or implicitly classify users by characteristics protected under fair lending laws — such as race, national origin, or sex — and risk making financial inclusion out of reach for millions of consumers.

TARGETED MARKETING: CROSS-SITE TRACKING, LEAD GENERATION, AND E-SCORES

To a great and perhaps unanticipated extent, the combination of sophisticated analytic techniques and big data has unmasked the anonymity of the Internet. In 1993, the New Yorker published its famous cartoon captioned, “On the Internet, nobody knows you’re a dog.” However, nearly three decades after the New Yorker’s observation, not only can web analytics recognize you are a dog, they also know your favorite toy, whether you chase squirrels, and the last time you wagged your tail. For humans on the Internet, the wealth of data include your current location, your neighborhood and its characteristics, your browsing and shopping habits, and the companies with which you do business.

This treasure trove of data about consumers can help enrich consumers’ experiences and provide financial benefits tailored to their situation. For example, some investment companies and financial institutions use robo-advisors to provide customers with portfolios based on their financial and risk profile.5 Both bank and nonbank financial service providers are exploring whether the use of alternative data sources in credit scoring can expand access to credit to creditworthy consumers with limited or no credit histories.6

However, consumer data may also be used in ways that consumers have not intended or anticipated, often to fuel increasingly sophisticated marketing strategies that aim to target certain consumer groups. Many consumers have experienced the feeling of being tracked on the Internet when the item they had been browsing on one website is now being advertised to them on a second and then a third site. But companies now rely on consumers’ browsing histories in less obvious ways as well. Through the use of sophisticated cross-site tracking, lead generation, and other techniques described more in this section, an immense amount of consumers’ personal data is now used to determine the types of products advertised to individual consumers, eliminating any possibility of a universal experience on the Internet.

Cross-Site Tracking

The advertisements a consumer sees while browsing the Internet are the result of a complex interaction of several invisible activities. Websites track users and their browsing behaviors, with the goal of creating detailed profiles that can be used for marketing purposes. Companies sell consumer data to third parties, which compile these data from many sources.7 Tracking methods often include cookies, or data files that are placed by the website in a user’s web browser. These are used by website owners to identify the user and personalize that user’s experience on the site.8 Cookies can then be used to track users across websites.9
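
To make the cookie mechanism concrete, the following is a minimal sketch, in Python with Flask, of how a hypothetical third-party tracking pixel might work; the endpoint, cookie name, and in-memory storage are illustrative assumptions, not any actual vendor’s code.

```python
# Minimal sketch of cookie-based cross-site tracking (illustrative only).
# A hypothetical tracker endpoint is embedded on many sites (e.g., via a
# 1x1 pixel image); the same cookie comes back on every visit, letting the
# tracker link page views across all those sites into one profile.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
profile_store = {}  # tracking_id -> list of pages visited (demo storage)

@app.route("/pixel.gif")
def pixel():
    tracking_id = request.cookies.get("trk_id")
    if tracking_id is None:
        tracking_id = uuid.uuid4().hex  # first visit: mint a persistent ID
    # The embedding page passes its own URL, so the tracker learns browsing
    # behavior on every site that includes the pixel.
    page = request.args.get("page", "unknown")
    profile_store.setdefault(tracking_id, []).append(page)
    resp = make_response(b"GIF89a")  # placeholder pixel body
    resp.set_cookie("trk_id", tracking_id, max_age=60 * 60 * 24 * 365)
    return resp

if __name__ == "__main__":
    app.run()
```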

Methods may also include fingerprinting, in which each computer or device is given a unique identifier, allowing website owners to track when the same device visits that webpage again. This can be a powerful tool in a time when many devices are used by only one individual.10 These methods also allow website owners (or others) to share the information they have collected and link together multiple profiles across different sites to yield a more finely detailed view of a single consumer.11 Indeed, companies are able to determine if multiple devices belong to a single consumer,12 and this information can be combined with offline data on consumers, such as data available from retailers and credit card companies.13 Taken together, these techniques, as well as others, allow companies to build ever more detailed profiles on individual consumers to target the marketing those consumers see.
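
A toy sketch of the fingerprinting idea follows, assuming a site can observe a few browser attributes; real fingerprinting draws on many more signals, and the attribute set here is purely illustrative.

```python
# Illustrative device fingerprinting: hash a handful of attributes a site
# can observe (user agent, screen size, timezone, installed fonts) into a
# stable identifier that persists even if the user clears cookies.
import hashlib

def fingerprint(user_agent: str, screen: str, timezone: str, fonts: list[str]) -> str:
    raw = "|".join([user_agent, screen, timezone, ",".join(sorted(fonts))])
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

fp = fingerprint(
    user_agent="Mozilla/5.0 (X11; Linux x86_64)",
    screen="1920x1080x24",
    timezone="America/New_York",
    fonts=["Arial", "Helvetica", "Liberation Serif"],
)
print(fp)  # the same device yields the same ID on every visit
```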

Lead Generators

In addition to the consumer data available from tracking techniques, online lead generators collect data about consumers by encouraging website users to volunteer personal information about themselves, often when users submit personal details to receive more information about a product or service. For example, a prospective homebuyer might submit personal information when using a mortgage rate calculator. This information can then be sold to mortgage brokers, credit card issuers, or others seeking details on prospective customers.14 Often, consumers may not even realize that the information they just entered on a website will be sold, at least until they start receiving unsolicited phone calls and text messages from companies they themselves did not contact.15

Lead generators may charge more for leads for consumers seeking credit, such as potential mortgage borrowers.16 Lead generation also can raise concerns about bias and exploitation. For example, at one time, the College Board’s website used personal information, such as whether prospective students expected to need financial aid, to immediately filter the results presented by its search tool and direct those individuals to search results highlighting private for-profit colleges over potential private and public nonprofit colleges and universities.17

E-Scores

Companies buying leads in bulk may seek even more data on consumers to better distinguish potentially profitable leads from those unlikely to result in a future customer. One method of predicting a consumer’s possible future activity is to use online consumer scores, or e-scores, which are calculated using complex algorithms and data mined from both online and offline sources.18 These privately calculated scores may factor in details such as occupation, salary, home value, and spending on certain consumer goods to predict a consumer’s future spending and to allow companies to rank a consumer’s estimated future profitability.19 A company might submit data sets containing the names of both leads and existing customers to an e-scoring service.

From those data sets, the e-scoring system would extract thousands of variables, identify predictive factors, and score the prospective customer leads based on how closely they resemble the company’s existing customers.20 E-scores are not new to the financial services sector. For example, a multinational credit card issuer used such scores to determine instantly what type of credit card to offer a customer calling into its call center. The scores also flagged customers thought to be “high-value”: those callers were immediately routed to agents, while callers considered less attractive were routed to an overflow call center.21
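
The following is a hedged sketch of how such “lookalike” scoring might be built; the features, data, and choice of a logistic regression model are all assumptions for illustration, not a description of any actual e-scoring service.

```python
# Sketch of lookalike e-scoring: train on existing customers (label 1)
# versus non-converting past leads (label 0), then score new leads by how
# closely they resemble the existing customers. All data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical feature columns: estimated income, home value, card spend.
existing_customers = rng.normal([90_000, 400_000, 2_500],
                                [15_000, 80_000, 600], (500, 3))
past_dead_leads = rng.normal([55_000, 220_000, 1_100],
                             [15_000, 80_000, 600], (500, 3))

X = np.vstack([existing_customers, past_dead_leads])
y = np.array([1] * 500 + [0] * 500)  # 1 = customer, 0 = dead lead

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

new_leads = np.array([[88_000, 390_000, 2_400],
                      [50_000, 200_000, 900]])
e_scores = model.predict_proba(new_leads)[:, 1]  # resemblance to customers
print(e_scores.round(3))  # the first lead scores far higher than the second
```

Note that if features such as income or home value correlate with protected characteristics, a score like this can act as a proxy for those characteristics, which is the core fair lending concern discussed below.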

In today’s world of targeted marketing, advertisements are built for individual consumers, with advertisers able to target their audience by a vast range of increasingly specific characteristics, such as location, political affiliation, or occupation.22 Companies rely on data on consumers’ past activity and on predictions of future activity. As with e-scores, marketing data companies predict consumers’ future activities by comparing specific consumers with other consumers deemed to be suitably similar. The idea behind many of these predictive models is that “birds of a feather flock together.”23 Although some consumers may appreciate receiving targeted online advertisements instead of general ones, some stakeholders are raising concerns about these practices as privacy researchers question how consumer data are collected and used, with some comparing it with exploiting natural resources.24

FAIR LENDING BASICS

The use of these and other targeted Internet-based marketing practices presents unique challenges, but it raises the same core fair lending risks present in the traditional, offline marketing of credit products. Although such data-driven practices may offer new benefits, this type of marketing is not beyond the reach of fair lending laws. The Federal Reserve, along with other federal agencies, enforces two primary federal laws that ensure fairness in lending and apply to certain marketing activities: the Equal Credit Opportunity Act (ECOA) and the Fair Housing Act (FHA).

The ECOA prohibits credit discrimination on the basis of race, color, religion, national origin, sex, marital status, age, or receipt of income from any public assistance program, or because a person has exercised certain legal rights under the ECOA and other financial statutes.25 The ECOA and its implementing regulation, Regulation B, apply to both consumer and commercial credit. Through Regulation B, the ECOA prohibits creditors from making oral or written representations in advertising or other formats that would discourage, on a prohibited basis, a reasonable person from making or pursuing a credit application.26 The FHA applies to credit related to housing and prohibits discrimination on the basis of race or color, national origin, religion, sex, familial status, and handicap.27 The FHA prohibits discrimination in advertising regarding the sale or rental of a dwelling, which includes mortgage credit discrimination.28 These fair lending laws prohibit two kinds of discrimination: disparate treatment and disparate impact. It is not uncommon that both theories may apply. Disparate treatment occurs when a lender treats a consumer differently because of a protected characteristic (e.g., race or age). Disparate treatment includes overt discrimination as well as less obvious differences in treatment. It does not need to be motivated by prejudice or a conscious intent to discriminate. The Federal Reserve has made a number of referrals to the U.S. Department of Justice (DOJ) involving discrimination in pricing and underwriting, as well as redlining. Many of these referrals have resulted in DOJ enforcement actions.29

Disparate impact occurs when a lender’s policy or practice has a disproportionately negative impact on a prohibited basis (i.e., protected characteristics), even though the lender may have no intent to discriminate and the practice appears neutral.30 A policy or practice that has a disparate impact may violate the law, unless the policy or practice meets a “legitimate business need” that cannot reasonably be achieved by a means that has less impact on protected classes.31 It is often possible to view issues raising fair lending concerns under both disparate treatment theory and disparate impact theory.
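
As a rough illustration of how a disproportionate impact might be screened for numerically, the sketch below compares favorable-outcome rates across two groups; the 0.8 threshold is borrowed from employment analysis (the so-called four-fifths rule) and the counts are hypothetical, so this is a heuristic screen, not a legal test.

```python
# Illustrative disparate impact screen: compare the rate of a favorable
# outcome (e.g., being shown a prime-rate advertisement, or being approved)
# across two groups. The 0.8 cutoff is a rough screening heuristic only.
def adverse_impact_ratio(favorable_a: int, total_a: int,
                         favorable_b: int, total_b: int) -> float:
    """Ratio of group A's favorable-outcome rate to group B's."""
    rate_a = favorable_a / total_a
    rate_b = favorable_b / total_b
    return rate_a / rate_b

# Hypothetical numbers: 300 of 1,000 group-A users versus 600 of 1,000
# group-B users were shown a prime-rate credit advertisement.
ratio = adverse_impact_ratio(300, 1_000, 600, 1_000)
print(f"{ratio:.2f}")  # 0.50 -- well below 0.8, so flag for review
```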

While the ECOA’s Regulation B and the FHA both provide specific prohibitions against discrimination or discouragement in the marketing of credit and/or mortgage credit, these laws also more broadly prohibit redlining and steering. Redlining is a form of illegal discrimination in which an institution provides unequal access to credit, or unequal terms of credit, based on the race, color, or national origin of a neighborhood.32 Likewise, steering is a form of illegal discrimination in which applicants or prospective applicants for credit are guided toward or away from a specific loan product or feature because of their race, sex, or other prohibited characteristic, rather than based on the applicant’s needs or other legitimate factors.33 Steering occurs when a bank’s actions are taken on a prohibited basis, even when those who have been steered are not measurably harmed.34 These and the other protections of the ECOA and the FHA apply to credit marketing in the online world, just as they do in the offline one.35

HOW TARGETED MARKETING MAY RAISE FAIR LENDING CONCERNS

Technology has made it easier for businesses to use consumer data for direct marketing and advertising to consumers who are predicted to be most interested in specific products. The ability to use such data for marketing and advertising may make it less expensive to reach consumers, resulting in a marketing strategy that may appear more effective to the advertiser. However, when such strategies are used to market credit, they may raise fair lending risks. By enabling advertisers (or the technology companies they rely on) to curate information for consumers based on detailed data about them, including habits, preferences, financial patterns, and where they live, there is a risk that this curation may result in digital redlining or steering. Likewise, when Internet-based marketing relies on artificial intelligence (AI) and machine learning (ML) technologies, the potential for discrimination may increase.

Facebook’s settlement in March 2019 with several civil rights organizations and the related discrimination charge issued by the U.S. Department of Housing and Urban Development (HUD) put a spotlight on these concerns.36 Facebook’s advertising practices initially drew attention when it was revealed that the company permitted advertisers to exclude groups of Facebook users with selected personal characteristics from viewing particular advertisements on the social media site.37 Facebook’s technology effectively allowed advertisers to show advertisements to certain users while excluding others based on sex or age, or on interests, behaviors, demographics, or geography that related to or were associated with race, national origin, sex, age, or familial status.38

The advertising platform also permitted advertisers to create custom audiences of Facebook users who shared common characteristics with the advertiser’s current customers or other desired groups.39 By permitting these features on its website, Facebook was alleged to have facilitated advertisers’ discrimination on multiple bases protected under the FHA because wide swaths of users were not able to view certain advertisements solely because of their personal characteristics.

Facebook’s March 2019 settlement promised significant changes: The company agreed to retool its advertising platform and appeared to acknowledge the risk of digital redlining in its decisions to limit the filtering options available to advertisers, restrict geographic targeting to a minimum radius of 15 miles from a specific address or from the center of a city, and disallow targeting by zip code. Likewise, it also seemed to address the harm caused when advertisements are not broadly accessible; it agreed to build a tool that would allow any Facebook user to view any advertisement for housing or credit placed on the platform anywhere in the United States, regardless of the audience originally targeted for that advertisement or where the viewer lives.40

In the days after the Facebook settlement, HUD also charged the company with housing discrimination because of these practices.41 HUD’s charge was the result of a formal, fact-finding investigation of the social media company by that agency, with HUD officials earlier noting that “[t]he Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse,” and that “[w]hen Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face.”42

The March 2019 discrimination charge provided significant detail regarding the company’s activities and alleged that Facebook not only facilitated discrimination by advertisers using its platform but that the social media giant also engaged in discrimination itself in how it delivered advertisements to users. Specifically, HUD alleged that:

  • Facebook’s advertisement delivery practices determine which users will actually see a particular advertisement, regardless of the advertisers’ own preferences, using user data that include sex and close proxies for other protected characteristics,
  • the company engages in price discrimination by varying advertisement pricing based on the audience for each advertisement, using user data that include sex and close proxies for other protected characteristics, and
  • the company combines proprietary data about user attributes and behavior on its platforms with user behavioral data it obtains from other websites and from offline sources, then uses ML and other prediction techniques to classify users to project each user’s likely response to a given advertisement, which has the effect of classifying users by protected characteristics.43

This final component of HUD’s allegations suggests that Facebook’s algorithms are applied to every advertisement on its platform, regardless of the advertisers’ intent. That is, Facebook’s advertising algorithms allegedly operate independently of advertisers to determine which users will view advertisements based on the users’ predicted response.

As a result, these algorithms may potentially raise fair lending risks and render some advertisements invisible to certain users, disproportionately impacting users based on protected characteristics, such as race and sex. Indeed, an academic study of Facebook’s advertisement delivery practices demonstrated just that, finding “previously unknown mechanisms that can lead to potentially discriminatory advertisement delivery, even when advertisers set their targeting parameters to be highly inclusive.”44 The study’s authors published groups of advertisements on Facebook, where advertisement features were varied to observe how changing a feature would affect the demographics of the audience of a particular advertisement. They found that the delivery of a particular advertisement may be skewed for reasons including the content of the advertisement itself, the images contained in the advertisement, and how advertisement images are classified by Facebook. Indeed, according to their results, an advertisement “can deliver vastly different racial and gender audiences” based solely on the advertisement’s creative content.45

For example, they created a suite of advertisements for both rental housing and real estate for purchase. They varied the type of property advertised and the cost of the property (as implied by text referencing “fixer upper” or luxury). The delivery of these advertisements was noticeably skewed based on race; some advertisements were delivered to Facebook audiences that were over 72 percent African American, while others were delivered to audiences that were over 51 percent white.46
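
In the spirit of the study’s methodology, here is a minimal sketch of how delivered-audience skew might be quantified; the impression counts are hypothetical.

```python
# Sketch of quantifying how ad delivery skews by race even when two ads use
# identical targeting settings. Audience breakdowns are hypothetical.
delivered = {
    "luxury_home_ad": {"african_american": 140, "white": 860},
    "fixer_upper_ad": {"african_american": 730, "white": 270},
}

for ad, audience in delivered.items():
    total = sum(audience.values())
    share = audience["african_american"] / total
    print(f"{ad}: {share:.0%} African American of {total} impressions")
# Identical targeting, very different delivered audiences -- evidence that
# the delivery algorithm itself is doing the sorting.
```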

Facebook’s advertisement delivery policies appear to be driven by both profit and efficiency: It appears that it may be most efficient to show advertisements to consumers who are the most likely to want a certain product or job because revenue is generated when consumers click on advertisements. But efficiency in this context may be at cross purposes with bedrock principles of nondiscrimination. Even though more men than women, for example, may arguably be interested in certain jobs, both the law and the social goals of diversity and inclusion require that both genders be shown the advertisements.

The HUD discrimination charge demonstrates the risks of relying on decision-making processes that are based on ML models that lack appropriate controls. With large volumes of consumer data now available from both online and offline sources, a wide array of industries are looking to AI and ML to automate decision-making processes and improve predictions of future outcomes because these technologies can find patterns or correlations in massive data sets that humans could not. However, ML algorithms are only as good as the data sets on which they are “trained.” It is this training data that teaches the algorithms what the outcomes may be for certain people or objects.

Incomplete or unrepresentative training data, or training data that reflect real-world historical inequities or unconscious bias, may lead to ML models that generate discriminatory results.47 For example, it was widely reported last year that Amazon had invested several years in developing an experimental hiring tool that relied on AI to rate candidates for employment opportunities. However, the tool did not make gender-neutral hiring recommendations as expected.

The tool had been trained using resumes that had been submitted to the company over a 10-year period, the majority of which had come from male applicants.48 As a result, it appears that the tool had learned to replicate the long-standing underrepresentation of women in the technology industry and reinforced this as the norm by downgrading resumes with references to the word “women’s” and to all-female colleges.49 Indeed, the concept of unconscious bias in AI and ML models has received increased attention in recent years.50 Unfortunately, algorithms do not remove human bias; even automated processes cannot escape the weight of data that has been tainted by such bias.51
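
A toy demonstration of this dynamic, and emphatically not Amazon’s system, is sketched below: a model trained on historically skewed outcomes learns to penalize a feature that merely proxies for gender, even though gender itself is never an input.

```python
# Toy demonstration (synthetic data, not any real hiring tool): a classifier
# trained on historically skewed hiring outcomes learns a negative weight on
# a resume feature that proxies for gender.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2_000
womens_college = rng.integers(0, 2, n)  # proxy feature visible on the resume
skill = rng.normal(0, 1, n)             # skill is distributed identically
# Historical labels: past reviewers hired less often when the proxy was set,
# independent of skill -- this is the bias baked into the training data.
hired = ((skill + rng.normal(0, 1, n) - 1.2 * womens_college) > 0).astype(int)

X = np.column_stack([skill, womens_college])
model = LogisticRegression().fit(X, hired)
print(model.coef_)  # negative weight on the proxy: the bias has been learned
```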

The use of data-driven technology in marketing also raises additional risks for discriminatory outcomes. One concern is that consumers will be misidentified and not offered the full range of products for which they might be qualified. A news article reported that a bank used predictive analytics to instantaneously decide which credit card offer to show to first-time visitors to its website: a card for those with “average” credit or a card for those with better credit.52 This practice and others like it raise the possibility that a consumer might be digitally steered to a subprime product based on behavioral analytics, even though the consumer could qualify for a prime product.

Another concern is that the intense curation of the information available to each consumer, caused in part by targeted marketing techniques, turns traditional notions of financial literacy and inclusion on their head. For years, consumers have been encouraged to seek information on financial products and to comparison shop. But those directives are undermined by targeted marketing; if the content that consumers see is determined by what a firm knows about them, it is not possible for them to select from among the full range of products and/or prices available online. Thus, even consumers who seek out information to make informed decisions may be thwarted from making the best choices for themselves or their families and instead may be subject to digital redlining or steering.

The growing prevalence of AI-based technologies and the vast amounts of available consumer data raise the risk that technology could effectively turbocharge or automate bias, further entrenching past discrimination into future decision-making. In other words, whereas in the past an individual’s conscious or unconscious bias may have resulted in discrimination, in the future these biases may be carried out by algorithms, in effect automating discrimination. Although AI and ML have promise, the potential to use increasingly detailed data about consumers to either purposefully or unwittingly automate forms of discrimination is very real. Given these risks, targeted marketing efforts used to advertise credit products should be carefully reviewed, as discussed in the next section.

IMPLICATIONS FOR FINANCIAL INSTITUTIONS

Although our knowledge of targeted Internet-based marketing practices, as well as the technology animating the practices themselves, is evolving, financial institutions nonetheless can address some of the risks of redlining and steering that such marketing may raise. For example, lenders can ensure that they understand how they are employing targeted, Internet-based marketing and whether any vendors use such marketing on their behalf. As the HUD discrimination charge against Facebook illustrates, advertising filters that exclude predominantly minority neighborhoods or groups of individuals based on a prohibited characteristic or another trait that is correlated with a prohibited characteristic raise fair lending concerns and could result in legal violations.

Lenders that use online advertising services or platforms can take steps to ensure that they monitor the terms used for any filters, as well as any reports they receive documenting the audience(s) reached by the advertising. It is also important to understand whether a platform employs algorithms — such as the ones HUD alleges in its charge against Facebook — that could result in advertisements being targeted based on prohibited characteristics or proxies for these characteristics, even if that is not what the lender intends.
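
As one concrete, and purely illustrative, form such monitoring could take, the sketch below scans a targeting configuration for attributes that are prohibited bases or possible proxies for them; both lists are assumptions for illustration, not a complete or authoritative taxonomy.

```python
# Hedged sketch of a marketing-compliance check: scan the targeting
# configuration a lender (or its vendor) submits to an ad platform for
# attributes that are prohibited bases or commonly cited proxies.
PROHIBITED = {"race", "national_origin", "sex", "age", "religion",
              "marital_status", "familial_status", "disability"}
POSSIBLE_PROXIES = {"zip_code", "language_preference", "ethnic_affinity",
                    "parenting_interests"}

def audit_targeting(config: dict) -> list[str]:
    findings = []
    for attribute in config.get("include", []) + config.get("exclude", []):
        if attribute in PROHIBITED:
            findings.append(f"prohibited basis used: {attribute}")
        elif attribute in POSSIBLE_PROXIES:
            findings.append(f"possible proxy, review: {attribute}")
    return findings

campaign = {"include": ["homeowners"], "exclude": ["zip_code", "ethnic_affinity"]}
for finding in audit_targeting(campaign):
    print(finding)
```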

Despite how new the technology may be, many of the tools to address fair lending risks in the offline world may be modified to mitigate risks in the evolving online world. For example, to mitigate redlining risk, lenders can closely review any geographic filters in use and include the monitoring of all marketing and outreach activities as part of their larger fair lending risk management programs.53

To mitigate steering risks, practices developed by brick-and-mortar lenders offering prime and subprime products through different channels may be helpful for lenders employing complex online marketing strategies. For example, lenders can ensure that, when a consumer applies for credit, she is offered the best terms she qualifies for, regardless of what marketing channel or platform was used to target marketing to the consumer or collect her application. By taking these and other steps, lenders and others who advertise credit products can work to ensure that technology is deployed in consumer financial services in ways that are consistent with a commitment to fair lending.
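
One hedged sketch of such a channel-neutral rule follows: the offer depends only on the applicant’s qualifications, and the marketing channel is deliberately ignored. The products and thresholds are hypothetical.

```python
# Sketch of a channel-neutral offer rule: the product offered depends only
# on the applicant's qualifications, never on the marketing channel that
# produced the application. Products and thresholds are hypothetical.
PRODUCTS = [  # (minimum credit score, product, APR), best terms first
    (720, "prime_card", 0.149),
    (660, "standard_card", 0.199),
    (0, "secured_card", 0.249),
]

def best_qualifying_offer(credit_score: int, channel: str) -> tuple[str, float]:
    # `channel` is accepted but deliberately ignored in the decision.
    for minimum, product, apr in PRODUCTS:
        if credit_score >= minimum:
            return product, apr
    raise ValueError("no product configured")

# The same applicant receives the same offer from every channel.
assert best_qualifying_offer(740, "subprime_targeted_ad") == \
       best_qualifying_offer(740, "prime_targeted_ad")
```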

CONCLUSION

Unlike the democratizing effect of the Sears catalog, targeted marketing may constrain consumers’ access to the broad range of products and services available today. By making assumptions about what products might be the right fit for consumers, targeted marketing has an increasingly significant, though largely invisible, impact on the advertisements shown to consumers online. In the context of credit, without careful implementation and monitoring, Internet-based targeted marketing may undermine financial inclusion if a consumer is not shown the full range of financial products and services for which she could qualify.

Technological innovation has played an important role in expanding access to consumer credit in the past and can continue to do so. Yet, the well-documented and persistent gaps in wealth and income between people of different races and ethnicities are a reminder of the high stakes that fair access to credit opportunities holds for many consumers, especially minorities.54 Thus, thoughtful design and monitoring of technologies that rely on consumer data are critical to guard against the risk that the volume and granularity of these data will lead to uses that automate human biases and calcify the legacy of past discrimination. The marketing of housing and credit products in particular carries obligations under the ECOA and the FHA. As a result, the use of technology reliant on consumer data for this type of marketing should be approached with an awareness of the risks that any selected technologies bring.

While the manner in which consumers access financial products and services has changed dramatically since the days of post office orders from the Sears catalog, financial institutions and their regulators need to ensure that the underlying bedrock principles of consumer inclusion and fairness remain timeless.

1 Louis Hyman, “Opinion: How Sears Helped Oppose Jim Crow,” New York Times (October 20, 2018).

2 See Endnote 1.

3 Gaby Del Valle, “How the Sears Catalog Transformed Shopping Under Jim Crow, Explained by a Historian,” Vox (October 19, 2018).

4 See Endnote 1.

5 Lisa Kramer and Scott Smith, “Can Robo Advisors Replace Human Financial Advisors?” Wall Street Journal (February 28, 2016).

6 See, e.g., “The Use of Cash-Flow Data in Underwriting Credit: Empirical Research Findings,” FinRegLab (July 2019); “Leveraging Alternative Data to Extend Credit to More Borrowers,” FICO Blog (May 22, 2019) (accessed July 12, 2019); “Alternative Data: The Key to Expanding the Credit Universe,” Experian (July 1, 2019). In December 2019, the Federal Reserve, along with other federal agencies, released an interagency statement on the use of alternative data in credit underwriting. Interagency Statement on the Use of Alternative Data in Credit Underwriting (December 3, 2019).

7 Jason Morris and Ed Lavandera, “Why Big Companies Buy, Sell Your Data,” CNN (August 23, 2012).

8 See “Cross-Site Tracking: Let’s Unpack That,” The Firefox Frontier (April 12, 2018).

9 See Endnote 8.

10 Simon Hill, “Computing: How Much Do Online Advertisers Really Know About You? We Asked an Expert,” Digital Trends (June 27, 2015).

11 See Endnote 10.

12 See Endnote 10.

13 Stuart A. Thompson, “These Ads Think They Know You,” New York Times (April 30, 2019).

14 Natasha Singer, “Secret E-Scores Chart Consumers’ Buying Power,” New York Times (August 18, 2012); see also “Led Astray: Online Lead Generation and Payday Loans,” Upturn (October 2015).

15 See “Led Astray,” Endnote 14, p. 5.

16 See Singer, Endnote 14.

17 Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown Random House, 2016).

18 See Singer, Endnote 14.

19 See Singer, Endnote 14.

20 See Singer, Endnote 14.

21 See Singer, Endnote 14.

22 See Endnote 13.

23 See Endnote 13 (quoting Chat Engelgau of Acxiom).

24 See Thompson, Endnote 13.

25 See 15 U.S.C. §§1691-1691f.

26 See 12 C.F.R. §1002.4(b) (2019). The question of whether targeted marketing would violate the ECOA has been raised before. For example, a January 2016 report issued by the Federal Trade Commission raised the question of whether targeted marketing of credit products, such as credit cards, using data and algorithms would violate the ECOA. The report advised caution. FTC, Big Data: A Tool for Inclusion or Exclusion? (2016).

27 See 42 U.S.C. §§3601-3619.

28 Under the FHA, it is unlawful to “make, print, or publish or cause to be made, printed or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin, or an intention to make any such preference, limitation, or discrimination.” 42 U.S.C. §3604(c); see also 24 C.F.R. §100.75(c)(3) (2019) (noting, for example, that discriminatory advertisements may include selecting media that make advertisements unavailable on the basis of a prohibited characteristic). Moreover, the Interagency Fair Lending Examination Procedures further highlight marketing-related factors that may raise fair lending risk. See Interagency Fair Lending Examination Procedures (Interagency Procedures) (2009), pp. 11–12.

29 See United States v. SunTrust Mortgage, Inc., No. 3:12-cv-00397-REP (E.D. Va., Consent order filed September 14, 2012); United States v. Countrywide Home Loans, Inc., No. 11-cv-10540 (PSG) (C.D. Cal., Consent order filed December 28, 2011); United States v. PrimeLending, No. 3:10-cv-02494-P (N.D. Texas, Consent order filed January 11, 2011).

30 The Supreme Court affirmed the availability of a disparate impact theory under the Fair Housing Act in Texas Dept. of Housing & Community Affairs v. Inclusive Communities Project, Inc., 135 S.Ct. 2507 (2015). It ruled that the Fair Housing Act permits liability under a disparate impact theory, prohibiting policies that seem to be neutral on their face but have a disparate impact on a protected class.

31 See Regulation B, 12 C.F.R. §1002.6, Comment 1002.6(a)-2; CFPB Bulletin 2012-04 (Fair Lending) (April 18, 2012). Notably, the Department of Housing and Urban Development (HUD) has proposed a rule regarding the standard for a discrimination claim alleging disparate impact. This rule is subject to the formal rulemaking process but, if enacted, would affect the requirements to prove a disparate impact claim.

32 Both the ECOA and the Fair Housing Act prohibit redlining. The ECOA and Regulation B prohibit discrimination in any aspect of a credit transaction. 15 U.S.C. §1691; 12 C.F.R. §1002.4(a). The Fair Housing Act also prohibits discrimination in making available or in the terms or conditions of the sale of a dwelling on the basis of race or national origin, and further prohibits businesses engaged in residential real estate-related transactions from discriminating against any person in making available or in the terms or conditions of such a transaction on a prohibited basis. 42 U.S.C. §3604(a) and (b), 3605(a); 24 C.F.R. §100.120(b). See Interagency Fair Lending Examination Procedures, pp. 29–30.

33 The broad protections of the ECOA and the Fair Housing Act also prohibit steering. See Endnote 32; see also Interagency Fair Lending Examination Procedures, p. 24.

34 See Interagency Procedures, Endnote 28, p. 24.

35 For example, discouraging prospective applicants because of a protected characteristic is also prohibited. Regulation B provides that “[a] creditor shall not make any oral or written statement, in advertising or otherwise, to applicants or prospective applicants that would discourage on a prohibited basis a reasonable person from making or pursuing an application.” 12 C.F.R. §1002.4(b); see also Comment 1002.4(b)-1.

36 Joint statement from Facebook, NFHA, CWA, ECBA, O&G, and ACLU, Summary of Settlements Between Civil Rights Advocates and Facebook (March 19, 2019). The New York Department of Financial Services is also investigating Facebook’s advertising platform. See press release, N.Y. Department of Financial Services, Governor Cuomo Calls on DFS to Investigate Claims that Advertisers Use Facebook Platform to Engage in Discrimination (July 1, 2019). But even before HUD’s charge against the social media company, several other fintech and big data reports also highlighted these risks. See, e.g., Jennifer Valentino-DeVries and Jeremy Singer-Vine, “Websites Vary Prices, Deals Based on Users’ Information,” Wall Street Journal (December 24, 2012), and Aniko Hannak et al., “Measuring Price Discrimination and Steering on E-Commerce Web Sites” (November 2017).

37 Julia Angwin and Terry Parris Jr., “Facebook Lets Advertisers Exclude Users by Race,” ProPublica (October 28, 2016); Julia Angwin, Ariana Tobin, and Madeleine Varner, “Facebook (Still) Letting Housing Advertisers Exclude Users by Race,” ProPublica (November 21, 2017).

38 In July 2019, in separate cases, the U.S. Equal Employment Opportunity Commission found “reasonable cause” to believe that seven employers violated civil rights laws by excluding women or older workers or both from seeing ads for employment those employers posted on Facebook. See Ariana Tobin, “Employers Used Facebook to Keep Women and Older Workers from Seeing Job Ads. The Federal Government Thinks That’s Illegal,” ProPublica (September 24, 2019).

39 An academic study of Facebook’s advertising platform found multiple mechanisms by which the social media company permitted advertisers to publish “highly discriminatory ads.” Till Speicher et al., “Potential for Discrimination in Online Targeted Advertising,” Proceedings of Machine Learning Research 81:1–15 (2018).

40 See Joint Statement, Endnote 36.

41 HUD Charge of Discrimination, March 28, 2019.

42 Press Release, HUD, HUD Files Housing Discrimination Complaint Against Facebook (August 17, 2018).

43 See Endnote 42.

44 See Muhammad Ali et al., Cornell University, Discrimination Through Optimization: How Facebook’s Ad Delivery Can Lead to Skewed Outcomes (April 3, 2019).

45 See Endnote 44.

46 See Endnote 44.

47 See, e.g., Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (2018); Nicol Turner Lee et al., Brookings Institution, Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harm (May 22, 2019); Testimony of Nicol Turner Lee, Brookings Institution, before the U.S. House Committee on Financial Services, June 26, 2019. In her testimony, Turner Lee defined bias as “outcomes that are systematically less favorable to individuals within a particular group and where there is no relevant difference between groups that justifies such harms.”

48 Jeffrey Dastin, “Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women,” Reuters (October 9, 2018).

49 See Endnote 48.

50 See, e.g., AI Now Institute, New York University, AI Now Report 2018, pp. 24–27; Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (2019).

51 See, e.g., Robert Bartlett et al., “Consumer-Lending Discrimination in the FinTech Era” (May 2019) (generally concluding that algorithms, such as those used by fintech lenders, do not remove discrimination from lending decisions).

52 Emily Steel and Julia Angwin, “On the Web’s Cutting Edge, Anonymity in Name Only,” Wall Street Journal (August 4, 2010).

53 For a general discussion of how institutions may manage redlining risk, see Consumer Compliance Supervision Bulletin (July 2018), pp. 2–4.

54 See Urban Institute, “Nine Charts About Wealth Inequality in America” (October 5, 2017).
