It is a sorry statement, but not entirely unexpected: Federal Election Commission Chairman Sean Cooksey recently wrote a letter to the Federal Communications Commission Chairwoman Jessica Rosenworcel attempting to discourage her agency from pursuing regulations requiring disclosures of artificial intelligence content in campaign ads on TV and radio.

Cooksey’s disparaging letter reads a little like a turf war between two independent agencies. Still, that interpretation is largely dispelled by the events leading up to the letter.

Extraordinarily rapid advances in artificial intelligence enable political operatives to produce computer-generated fake images and voices of candidates doing or saying something they never did. When the images and voices of candidates are fabricated, with the intent of causing harm to the candidate or otherwise deceiving voters, these are called “deepfakes.”

While AI has been around for a while, today’s deepfakes look and sound so realistic that they are virtually undetectable as fraudulent depictions of things that never really happened. Yet the FEC — the agency responsible for enforcing fair and open campaign finance laws — does not require that sponsors of deepfakes disclose in their ads that the imagery and audio are not real.

More than a year ago, Public Citizen petitioned the FEC to issue transparency rules under the campaign finance law against “fraudulent misrepresentation.” The rules would require disclaimers in the ads explaining that the deepfakes are not real: that the candidates never did or said what is depicted.

Cooksey and his fellow Republican colleagues on the FEC declined to consider the petition under the guise of improper formatting. Public Citizen submitted a second petition a month later, following the agency’s formatting suggestions to a T. The agency had no choice but to accept the second petition for consideration at a later date.

Some 2,400 public comments were submitted to the FEC, almost all encouraging the agency to write disclosure rules on deepfakes before the likely deepfake-riddled election of 2024.

The agency dawdled and dawdled some more.

It now appears the FEC will vote on whether to begin rulemaking at the end of June. This delay makes it unlikely any disclosure requirement on deepfakes will be in place during the fall campaign.

It is not hard to imagine that blockbuster deepfake videos misrepresenting candidates will be released weeks before the election and go viral on social media, with little ability for voters to determine that the claims are fraudulent.

The cost to democracy may be dear. Many voters are already suspicious of elections’ validity. If voters can no longer discern fact from fiction in campaign communications, the public’s confidence in the integrity of elections will be in further peril.

So the FCC decided to step up to the plate. FCC Chairwoman Rosenworcel recently announced plans to implement some disclosure rules before the 2024 election, at least for broadcast ads.

FEC Chairman Cooksey doesn’t like this. He wrote: “I am deeply troubled that you reportedly hope to have the regulations in place before the election. … As a result, your agency would be interfering with and undermining political campaigns and the election. I urge you and your FCC colleagues to delay … until after November 5, 2024.”

Cooksey also appears intent on preventing his own agency, the FEC, from proceeding with rulemaking on deepfakes before the election. It remains to be seen whether the rest of the commission will overrule him.

Fortunately, a great deal of activity is happening in the states. Twenty states have seen the dangers of deepfakes in misinforming voters and have recently passed laws banning or requiring disclosure of deepfakes. Voters can only hope that Rosenworcel’s FCC continues to assert its independence and proceeds likewise with all due speed.

Granted, any agency rule, either by the FEC or the FCC, would be limited. The FEC’s fraudulent misrepresentation authority applies only to candidates, and the FCC’s authority applies only to broadcasters. But any federal transparency rules on deepfakes before the 2024 election would be a crucial beginning — and better than none.