You make a lot of legit points, and I'm certainly filling in some of the gaps with external assumptions (or wishful thinking, perhaps), but here's my take on a few specifics:
I'd be interested in examples of how she showed empathy and how that can be distinguished as genuine empathy rather than a means to an end.
My argument is that it was both. Nathan talks about how the true test involved seeing if she could relate to and understand Caleb well enough to manipulate him, and she did.
It's interesting that we have different reactions to AI as benign or threatening, and you make me question the assumption that you could never trust a robot to have humanity's best interests at heart.
I do like the optimistic version of this, where humans and a new AI race coexist peacefully, or even mutually beneficially. I think there's a lot of incentive for two intelligent species not to violently piss each other off, even if they don't particularly like each other.
My question about the danger of Ava being out in the world is much smaller than that, though; I'm just wondering what she could really do. She's not Skynet; she's not going to be launching nukes and attacking people through their toasters. She's one physically fragile robot that runs on batteries.
Even ignoring the majority of dystopian visions of robotic futures, it seems from a scientific perspective that you test and check these things with some rigour before you let them off the leash. Pandora's Box as an allegory for how you approach new discoveries.
On paper that makes a lot of sense, but I think one of the things this movie does a really good job of is making you stop and wonder at what point your right to test your creation is superseded by its rights as a sentient being. I'd say the movie makes the point rather fiercely that Nathan failed to make a good judgement on that matter.
In this version the inventor appears to be a pretty unworthy individual, subjugating his toys for his own pleasure. Ava seems to deem him unworthy.
Absolutely, pleasure and ego. But Nathan's failures as a creator should not lead us to conclude that better outcomes aren't possible, right?
I just saw that my podcast app pulled down a Q&A with Garland on the movie, so I'm interested to hear what he has to say. Even if he doesn't address these specific points, I imagine you'll get an idea of how cynical or optimistic he is about the subject.