Monday, July 26, 2004
Notes on I, Robot
or, The Squandered Opportunities of I, Robot
This movie has always looked really bad. I've never even read the book, and it was obvious to me from the trailer that it was a major departure from anything Asimov would have written. So I was pretty surprised when I went and saw it anyway and it wasn't as bad as I expected. Well, I take that back: in the ways it was bad, it was in line with forecasts, and I'll just associate myself with David Edelstein's review in Slate. He's absolutely right. It did have some surprisingly interesting aspects, though, and I think it's a shame they weren't explored further.
According to IMDB, this movie wasn't originally based on the I, Robot stories at all; it was simply a futuristic murder mystery called "Hardwired" that took place in a similar setting. That probably would have been a more appropriate frame for a project like this, and I think it helps to understand what I'm saying here if you keep in mind that the movie was forced into Asimov's imagined world, rather than thinking of it as a really, really horrible adaptation of what he wrote.
I also thought that Smith's Luddite leanings were dumb and unnecessary. Dumb, first off, because he always trusts some other type of technology, just one less advanced than the one he's dealing with. He'll ride a motorcycle he didn't assemble, but he doesn't trust robots. He'll drive a car manually at insane speeds, not trusting its computer to keep it in line with the computerized road markers. Philip K. Dick characters always find themselves out of place, yet they feel comfortable in the world they inhabit and don't question it (Who ever does? We only distrust technologies that seem foreign to us, but his characters grew up in the environments they inhabit). We in the audience, of course, are supposed to know that he's right. But really, up until the point when he is right, he's wrong. About everything. For apparently a huge chunk of his life. It's very strange when someone believes something so contrary to what reality seems to be for so long. It happens, of course (I'm sure we can all name names), but it certainly doesn't feel heroic.
Here's an example of something I did find interesting, though. There's a scene where Will Smith is driving on the freeway, and trucks filled with robots start opening up and pouring attack robots onto his car. The first robot lands on his hood, punches through the windshield, and tries to grab the steering wheel to crash the car, and as it does, it says in a very alarmed, concerned voice, "You are having an accident!" That got a laugh out of the audience, and out of me as well, but I also thought it was genuinely disturbing. In another scene, after deadlocking in a hand-to-hand fight with Will Smith, a robot hears sirens and runs at full speed to throw itself into some wreckage and fire. I think it would have been interesting if the movie had done more with how creepy it can be when technology (or really, anything) acts "confused" like that. Instead, the robots just get red lights in their chests and eyes and start ordering people around and attacking them. That's creepy in theory, but it didn't feel that way in the movie, which surprises me, given what a masterfully creepy job Alex Proyas has done on other movies.
The movie finally got a little interesting when Will Smith finds the U.S. Robotics CEO murdered, and suddenly the movie's apparent villain up to that point turns out to have been innocent and naive. The true villain, a non-character up to that point, turns out to be interesting: the AI system the company uses to help design and coordinate all the robots. The system, reflecting on the laws of robotics, realizes that there is a sort of contradiction, or at least a vagueness, in the first law. If robots aren't supposed to harm humans, or through inaction allow humans to come to harm, then what if they could run the world better so that humans were safer? Thus the computer decides to build a central control mechanism into all the robots so that they can take over the world, the better to protect humanity.
I thought this was a clever idea too, but the impact is blunted because all the interesting directions it could go in are instead diverted entirely into a stupid subplot about the belief that computers can "evolve" or become conscious, or something. At several points throughout the movie, such mystical ideas are hinted at, and of course, they come off as half-assed and forced as they sound. It never goes anywhere; the mainframe announces that she has "evolved" and that this is what led her to her conclusion, and that's the end of that plot thread. But where did evolution come into play here? That strikes me as a pretty straightforward conclusion a computer could arrive at without needing to attain some sort of mystical consciousness. If you really can't give that type of subplot what it needs, it's probably better to leave it out entirely than to let it sit there, awkward and undeveloped.
The movie had a very weird camera style. In the final climactic moments in the mainframe's core, the camera zips in orbits around a beam Will Smith is standing on while he fights off robots. It's incredibly fast, and thus hard to read, but I thought it was an interesting shot to attempt (it didn't work, I think). Similarly, when Will Smith is in a house that is getting demolished by a gigantic (cool-looking) robot, the effects look terrible. Will Smith is obviously not really in the scene, and it drains the drama from it. In fact, you get that feeling in most of the movie's big scenes. The edges of the real actors seem to be blurred so they blend into the background. (I'll spare you the required bit about how this all makes the movie "feel" fake, or whatever.)
I remember a few years ago reading that Chris Cunningham wanted to make a far-future sci-fi movie that was filmed entirely in closeup, with blurred backgrounds. I, Robot is pretty much the exact type of movie that would have been helped out by that, not only because special effects still don't look realistic enough, but also because it might have forced them to look at some of the more interesting possibilities in this movie.
Comments:
Hmm, it's interesting that the script was originally not meant to be "I, Robot". However, any story about a robot on trial borrows from Asimov (whether the author read Asimov or not). What turned me off from seeing it was the Slate article that said the robots turn out to be the bad guys in the end: the beauty of the stories was that they were always a puzzle, figuring out how the robots were actually doing the right thing. According to Slate, they seem to have thrown that out the window. (I did go and read the spoilers. It's only true twist-ending movies whose spoilers I don't want to see: The Village I will avoid seeking spoilers for.) However, your article seems to show they did get Asimov right after all (gee, Slate is slipping or something). It's just that they give it a little post-9/11 twist: it's better to have your own choice and freedom at the cost of not being 100% protected.
BTW, very good and interesting points about how you get used to what you grew up with. There are several old people in San Diego I've spoken to who just *hate* computers. This is really the first time in my life that I've seen that. When I asked further, it's only because they don't understand them. It's like when you learn to drive only automatics and start a campaign that sticks suck, or something. (On the other side, if you drive only sticks, you tend to make automatics seem wimpy.)
"Who ever does? We only distrust technologies that seem foreign to us, but his characters grew up in the environments they inhabit"
Oh wait, but that doesn't actually explain the Luddites we have today, those who go with older equipment and become isolationists. Some of these people are already comfortable with our technology.