On June 13, Florida Governor Ron DeSantis signed into law new legislation that opens the door to fully autonomous vehicles in a way no other state has. “A fully autonomous vehicle may operate in this state regardless of whether a human operator is physically present in the vehicle,” the law reads in no uncertain terms.

It also amends previous traffic rules to comport with a more modern idea of what “driving” will be in an autonomous future, such as exempting AVs from rules that, for example, ban using your phone or watching TV while the car is in motion.

While two other states, Michigan and Texas, allow so-called “people-free” vehicles, Florida’s law goes beyond that by preempting any local regulations that differ from the state law, essentially turning all of the state’s public roads into one giant AV testing lab.

“They’re going to be treated like any other car,” says Dorothy Glancy, a law professor at Santa Clara University who specializes in transportation law. It is as if Florida is declaring, she says: bring your AVs to our state, and we will not hassle you.

At the same time as Florida throws open its doors, the law is incredibly vague and leaves many important questions unanswered. Glancy called the law’s definitions “great big marshmallows” due to their lack of specificity. Indeed, even the definition of “automated driving system,” the very technology the law is about, is squishy:

The hardware and software that are collectively capable of performing the entire dynamic driving task of an autonomous vehicle on a sustained basis, regardless of whether it is limited to a specific operational design domain.

Troublingly, the law does not define what constitutes “a sustained basis,” the only semblance of a standard the law offers for what this AV technology must be capable of before taking to Florida’s roads. In fact, the law lowers the bar for AV operation so much that computers can now operate cars in Florida more easily than humans can, since we lowly carbon-based creatures must pass a driving test before getting behind the wheel, whereas a computer does not.

This could have major implications not just for companies in the testing phase, like Waymo, Uber, Aurora, and dozens of others, but also for existing automakers looking to make the jump from semi-autonomous driving features to fully autonomous ones. It also has profound implications for what human drivers, cyclists, and pedestrians are subjecting themselves to when they decide to use Florida’s roads and sidewalks.

The law has curious implications particularly for Tesla, which has been the most bullish of any car company on self-driving futures and has the capability to push software updates that radically change every car’s functionality. Elon Musk infamously predicted a million Tesla robotaxis by next year, a timeline the rest of the industry nearly uniformly rejects.

Robotaxis aside, the law potentially opens the door for Tesla to push a software update to all its cars in Florida that no longer requires drivers to keep their hands on the wheel when Autopilot is engaged (something many of its drivers dangerously and negligently fail to do anyway). According to the new law, as long as the autonomous vehicle can drive itself “on a sustained basis,” the car’s occupants are potentially exempt from the state’s driver safety laws, including the bans on using their cell phones or even watching TV. It is far from clear whether on-ramp-to-off-ramp driving, the official territory where a user can enable Autopilot, qualifies as a “sustained basis.”

The potential safety implications could be deadly, as Florida well knows. In March, a Tesla Model 3 in the state crashed into a semi-truck that was crossing a four-lane divided highway. A preliminary National Transportation Safety Board investigation concluded Autopilot was engaged at the time of the crash. Investigators also found the driver was not applying any torque to the wheel at the time of the crash, though that doesn’t necessarily mean his hands were not on it.

In some ways, the new law may help clarify some confusion over who is responsible in cases where an automated system is engaged at the time of a crash. Should Tesla declare Autopilot an autonomous driving system, as the company so loves to suggest in marketing materials and public addresses, then the automated driving system is legally the operator of the vehicle whether there’s a human inside or not.

Of course, this comes with a big ol’ catch for Tesla and any other potential AV operator. The operator of a vehicle is, at least in theory, responsible for making sure it doesn’t crash into anything, and liable if it does. The law does stipulate that AV companies must carry the requisite insurance policies, just like any other car owner, and Florida is a no-fault state.

But Glancy suggested this issue of fault and liability could get complicated. AVs are often a combination of software and hardware from several different manufacturers. The car brand may not be the company that made or wrote the code for what caused any given crash.

The Florida law is similarly vague on how traffic violations would work. AVs are subject to the same laws as every other car, but the logistics of actually enforcing them could get complicated. “You don’t put a car in jail. You don’t fine a car,” observes Bryant Walker Smith, an assistant professor at the University of South Carolina School of Law who studies legal issues surrounding autonomous cars. The same, of course, can be said for a computer program.

Smith says Florida is betting that, by signaling its friendliness to AV companies, it can attract those businesses to the state by giving them fertile ground to run their products in real-world environments, even if the law probably didn’t change the legal status of AVs to begin with. In 2014, he scoured the 1949 Geneva Convention on Road Traffic, National Highway Traffic Safety Administration (NHTSA) regulations, and the vehicle codes of all fifty U.S. states before coming to a conclusion that doubled as the title of the paper born from his work: automated vehicles are probably legal in the United States.

Smith came to this conclusion based on a pretty simple premise: everything is legal until it’s not, and the law is decidedly vague on whether the driver of a car must be a human. Courts generally find that what is not explicitly illegal is therefore legal. As such, Smith struggled to find specific laws that could be construed as banning autonomous vehicles.

The fact that there are so many unanswered questions and gray areas is probably not an accident, Glancy suspects. Instead, it is emblematic of a different policy approach Florida is taking compared with states such as California, Michigan, and Pennsylvania, which have created legal frameworks that allow AV companies to test their systems only in limited areas or circumstances, or subject to a permitting process.

The thing about real-world environments is that there are other people there, people who did not consent to being part of a science experiment. Glancy’s research, whose findings have been echoed by surveys from organizations like AAA, overwhelmingly demonstrates public skepticism toward, or outright hostility to, AVs. Most Americans don’t trust them or are flatly afraid of them.

“People are going to die,” Glancy predicted, although she didn’t mean it as the dire warning it may sound. Tens of thousands of people are killed by vehicles every year in the United States, so it stands to reason AVs, which are ultimately still cars with fallible drivers, will kill people, too. The question is whether they will kill fewer people than human drivers, and if so how many fewer.

Up to now, the industry’s approach has mostly been one of caution (with a few notable exceptions). But Florida’s law indicates some states may start accepting a much higher degree of risk, exposing members of the public to harm, and likely deaths, in order to help AV manufacturers improve their systems.

The charitable interpretation here is that, in order for AV systems to get safer, they have to be exposed to some degree of real-world risk; that is a necessary condition of technological advancement in transportation. Glancy, for example, likes to point out the mass casualties that resulted from early steamship and railroad operations. Airplanes could be added to the list.

Further, it’s possible that AVs, for all their faults and imperfections, will on balance save more lives than they kill, because humans can be pretty terrible drivers. The problem is, that’s a purely hypothetical argument at this point, and there may be only one way to find out. It’s not that anyone wants masses of people to die so we can have more efficient travel, but that seems to be, for better or worse, how our society works. And there’s no opt-out of that.