May 9, 2024
On April 29, 2024, the U.S. Department of Labor’s Wage and Hour Division (WHD) published Field Assistance Bulletin (FAB) No. 2024-1 addressing artificial intelligence and automated systems (AI) in the workplace under the Fair Labor Standards Act (FLSA) and other federal labor standards.[1] This FAB responds to Executive Order 14110, which President Biden issued in October 2023. The Executive Order called for a coordinated U.S. government approach to the responsible and safe development and use of AI. The Executive Order required the Secretary of Labor to issue guidance to make it clear that employers that use AI to monitor or augment employees’ work must continue to ensure compliance with the FLSA and other legal requirements. See our article about the Executive Order here.
Though the WHD recognizes the value to employers of AI to streamline tasks, improve efficiency and safety, and enhance accountability, it also warns that, without responsible human oversight, the use of AI poses potential federal labor standards compliance challenges.
Read on for the highlights of the FAB.
Hours Worked
Generally, the FLSA requires payment to covered employees of at least the federal minimum wage for each hour worked and at least one and one-half times their regular rate of pay for each hour worked in excess of 40 hours in a workweek. Employers must pay employees for all hours worked regardless of the employee’s level of productivity or performance.
Short breaks of 20 minutes or less taken during the workday generally are counted as hours worked; longer breaks during which an employee is completely relieved from duty are not hours worked.
Tracking work time
Some employers use AI to measure and analyze metrics that may indicate employee productivity, like computer keystrokes, mouse clicks, eye movements, website browsing, or activity in front of a web camera. However, such metrics are no substitute for an analysis of whether an employee actually performed "hours worked" under the FLSA.
Monitoring break time
Employers often use timekeeping systems to assist in fulfilling their obligations of accounting for hours worked. Some systems now incorporate AI to make predictions and to auto-populate time entries based on a combination of prior time entries, regularly scheduled shift and break times, and other data. These predictions do not relieve employers of the responsibility to ensure that records accurately reflect breaks taken and that employees are properly compensated for all hours worked.
Waiting time
Some employers (e.g., hotels [for housekeeping workers] or warehouse and distribution centers) use AI to assign tasks and set work schedules. Use of these systems can lead to problems determining hours worked when an employee is waiting for their next task to be assigned or their schedule to be updated. If an employee is not given sufficient time for their own purposes, or is not completely relieved from duty, or is expected to remain nearby, or is not given a set time to report back to work, they generally are considered to be “engaged to wait.” The FLSA considers such periods to be hours worked.
Work at multiple locations
The location of work does not determine whether the employee has performed hours worked. A workday may extend beyond the employee's scheduled shift and beyond their time at any particular location.
Some employers use location-based monitoring to track employees and automate the clocking in and clocking out process. However, the use of such monitoring systems may create problems when they fail to account for work performed in different locations. For example, an employer may ask an employee to pick up equipment at the company’s headquarters on their way to the worksite. A system that records as compensable hours only the time the employee spent at the worksite may fail to account for all hours worked.
Calculating Wages Owed
Some AI systems use algorithms to determine employees’ rates of pay, which are then used to calculate overtime. Human oversight is necessary to ensure that such systems accurately calculate an employee’s regular rate of pay and overtime premium. The FAB provides the following example:
in the case of an employee who is paid multiple or different wage rates based on different metrics, the employer must ensure that the different rates are properly calculated into the regular rate of pay by adding together the employee’s total earnings for the workweek (including earnings from hourly rates, piece rates, or other compensation), and then dividing these earnings by the total number of hours the employee worked during that same week. The regular rate would then be divided in half, and then multiplied by the number of hours worked in excess of 40 to determine the employee’s total overtime premium compensation. Alternatively, where an employee is paid multiple rates, the employee and the employer may agree to compute overtime pay at one and one-half times the hourly rate in effect when the overtime work is performed.
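The weighted-average calculation the FAB describes can be sketched in a few lines. This is an illustrative sketch only, not the FAB's or the WHD's code; the function and variable names are our own, and it assumes a single workweek with all straight-time earnings already totaled.

```python
def overtime_premium(earnings_by_rate, total_hours):
    """Sketch of the FLSA weighted-average "regular rate" overtime premium.

    earnings_by_rate: straight-time earnings for the workweek (hourly pay,
                      piece-rate pay, or other compensation), as a list.
    total_hours: total hours actually worked in that workweek.
    """
    total_earnings = sum(earnings_by_rate)
    # Regular rate = total weekly earnings divided by total hours worked.
    regular_rate = total_earnings / total_hours
    overtime_hours = max(0, total_hours - 40)
    # Straight-time pay for all hours is already included in total_earnings,
    # so only the half-time premium remains owed for overtime hours.
    return 0.5 * regular_rate * overtime_hours

# Hypothetical example: 30 hours at $20/hr plus 20 hours at $15/hr.
# Total earnings $900 over 50 hours -> regular rate $18/hr;
# premium = 0.5 * 18 * 10 overtime hours = $90.
print(overtime_premium([30 * 20.0, 20 * 15.0], 50))  # 90.0
```

Note the design of the calculation: because the employee's straight-time earnings already cover every hour worked, only the additional half-time premium is multiplied by the overtime hours, which is why the regular rate is "divided in half" in the FAB's description.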
Family and Medical Leave Act
Under the Family and Medical Leave Act (FMLA), eligible employees of covered employers are entitled to unpaid, job-protected leave for qualifying family and medical reasons.
Processing Leave Requests
Some employers use AI to process leave requests, including determining eligibility for leave, calculating the amount of leave available, or evaluating whether leave is for a qualifying reason. However, as discussed above under the FLSA section, AI may incorrectly count an employee's hours worked, which may negatively impact FMLA eligibility (among other criteria, an employee must have worked at least 1,250 hours of service for the employer during the 12 months before leave begins). Similarly, AI that incorrectly determines which days should be counted against an employee's leave entitlement may lead to an improper denial of leave.
Medical Certifications to Support Leave
Employers may require an employee to submit a medical certification from a health care provider to support a leave request. The FMLA limits the information an employer may request, and an employer must give the employee at least 15 calendar days to provide the requested certification.
AI systems used to administer FMLA leave may seek disclosure of more medical information than the FMLA allows. AI systems that trigger penalties when an employee misses a certification deadline could violate the FMLA if the deadline should not have been imposed in the first place or if the employee was entitled to additional time to provide the requested certification.
The PUMP Act
The Providing Urgent Maternal Protections for Nursing Mothers Act (PUMP Act) amended the FLSA to give most nursing employees the right to reasonable break time and space to express breast milk at work.
AI that limits the length, frequency, or timing of a nursing employee’s pumping breaks would violate the FLSA. In addition, AI that monitors productivity and penalizes a worker for failing to meet productivity standards because they took pumping breaks also would violate the FLSA.
Employee Polygraph Protection Act
The Employee Polygraph Protection Act (EPPA) generally prohibits private employers from using lie detector tests, whether on current employees or to screen candidates for employment. However, in certain industries and under certain conditions, qualified examiners may administer polygraph tests.
Some AI technologies use eye measurements, voice analysis, micro-expressions, or other body movements to suggest if someone is lying or to detect deception. Such technologies would violate the EPPA unless used in accordance with the limited exemptions provided in the law.
Retaliation
The WHD warns that using AI to surveil a workforce for protected activity, or to take adverse action against workers for engaging in protected activity under one or more laws that the WHD enforces, constitutes unlawful retaliation. For example, using AI to detect, target, or monitor workers suspected to have filed a complaint with the WHD or to have cooperated with WHD investigators could constitute unlawful retaliation.
Takeaway
When you use AI responsibly, it can help you improve compliance with the law. Without responsible human oversight, however, AI can put you at risk. "The AI did it" is no defense to a violation of labor and employment laws.
This article is designed to provide one perspective regarding recent legal developments, and is not intended to serve as legal advice. Always consult an attorney with specific legal issues.
[1] On April 3, 2024, the U.S. Equal Employment Opportunity Commission (EEOC) issued a Joint Statement on Enforcement of Civil Rights, Fair Competition, Consumer Protection, and Equal Opportunity Laws in Automated Systems. The statement is similar to the WHD’s FAB in that it warns that, although AI offers the promise of advancement, its use also has the potential to “perpetuate unlawful bias, automate unlawful discrimination, and produce other harmful outcomes.” For the EEOC’s part, it referenced its two technical assistance documents on the issue of AI: one explains how the Americans with Disabilities Act applies to the use of AI to make employment decisions; and the other explains how the use of AI may lead to disparate impact under Title VII of the Civil Rights Act of 1964.