Thursday, June 30, 2005
Notes on War of the Worlds
Steven Spielberg has actually provided us with an interesting case study in comparative directing, in this case by remaking Signs. It turns out he's a lot better than M. Night Shyamalan (who was heralded as the next Spielberg before he'd made any other movies), but it also painfully reminds us how awful Signs was. Spielberg's remake is much better, don't misunderstand me. But it is the same movie.
When I heard complaints about the ending, I assumed that meant they changed it from the original ending, as they seem to like to do in remakes these days, but no, indeed it is the familiar ending. I guess people don't really know how the original ends. Of course, in this day and age, it might be much harder to buy such an ending; the alien technology certainly requires an understanding of the atom, and it seems unlikely that they'd miss microbes on their way to investigating the atom, and similarly unlikely that their species evolved without stopping off at the single-cell stage, so it's hard to believe they'd be fully unaware of such a thing. Also, Spielberg's need for happy endings is a disease. When an annoying character does his billionth unbelievably stupid thing and dies, you know he didn't really die. I would have liked an explanation for how he managed to save his unbelievably stupid ass from the totally unsurvivable fate that befalls the entire regiment he's right next to, though. At least the stepdad didn't die and respark the romance between Cruise and his ex-wife. Ugh.
Anyway, other stupid things: An EMP knocks out every electronic device near the lightning storm in the beginning. Including his wind-up watch. OK, that's stupid. But then just two minutes later we see a shot of someone using a camcorder to film the tripod. How did that happen to survive? The plane wreckage was totally unrealistic, and there's no way a house would survive the flaming jet fuel of a 747 falling on it. Nor would the engine still be spinning the next day, not to mention that the plane shouldn't have even been flying that night, given what we know about US emergency flight policy after 9/11. So yeah, there's not a lot of following the rules in this movie.
So it's pretty similar to Signs. Why is it somewhat better? Well, it has some more action (although from the ads I wasn't expecting half the movie to be spent in a basement). Spielberg has a genius for intense scenes in thrillers like this, and several are quite well done. There's nothing like the kitchen scene in Jurassic Park, but the honk sound that the tripods emit is fantastic, and sounds like alien death.
And then that ending, though.
Sunday, June 12, 2005
Apple and Intel
You might already think that Apple's decision to use Intel chips is a good idea, but I think it's a great idea.
Markets, as they grow and mature, tend to get more specialized. It used to be, you could get a Model T in any color, as long as it was black. Today's car market caters to such specialized tastes as "small-cab pickups" and "luxury sports sedans." The same is happening to computing. The days of the do-it-all machine are disappearing, and we're moving towards more devices and more specialized devices. There's no way that a single chip can provide the outstanding floating point performance that games want, as well as the dynamic reordering and so forth that a mainstream PC needs to run older code efficiently. Not at a decent price/heat level/whatever. Choices have to be made, and it makes sense for floating point monsters to be built for game machines at the expense of other factors. The PC is being dismantled into smaller pieces and moved towards a greater variety of more specialized configurations; there'll be cheap game consoles, expensive workstations for content creation, cheap media recording/receiving/playback machines, cheap machines for internet connectivity, and so forth. It's a natural process, and we'll probably end up with much better products for it.
Well, pity poor Apple. They were one of the first to see this trend, and a few years ago Jobs told us his strategy: make the Mac the "digital hub" of the home, connecting and distributing media and so forth to TVs and stereos. They've made some great progress on this front, with their digital music and AirPort lines, and so forth. I expect some more good stuff to come from them for video and movies.
However, a few weeks ago I wrote about the competitive threat that the new video game consoles were going to present to Apple. They too want to be the digital hubs of the home, and they have the content that's going to drive people to buy them and connect them to TVs and stereos. iTunes and AirPort Express are great, but those PS3s and Xbox2s are going to present real problems for Apple, because Apple doesn't have any games to get people to connect up with its stack. Some people might go for hybrid solutions, but most people will settle on a single solution. I've also suggested that Apple team up with one of the obvious winners (Sony or Microsoft), or find someone else who is being left out in the cold, like Nintendo, and combine forces with them.
But Apple found someone even better: Intel. With the next wave of home entertainment devices, Microsoft, Sony, and IBM are all winning, and even Apple is better positioned than Intel to profit from this trend. Now, with Apple, Intel has a vector into the home, and Jobs did hint that they had other consumer electronics devices planned with Intel. Major PC manufacturers, like Dell and Gateway, have attempted to move into this space by selling flat-panel TVs and other devices, but that's clearly not going to be where the value is. If Intel is smart, they'll treat Apple very well, and not just as another relatively low-volume PC maker.
So, in concrete terms, I don't know what will come of this. I suspect that by moving to Intel chips, Apple has greatly improved the chances of people buying their machines. Windows will probably run on the new Macs, though Apple has said it won't work out of the box, and game developers seem to see moving to another platform with the same chips as easier than moving to a different platform with different chips. Perhaps Apple will even be able to use some VMware- or Wine-like technology to get decent game emulation performance for specific games. I'm sure they've explored that, but I don't think it's a likely scenario.
They've got a big hungry giant they can work with, and if they're both smart, they'll find ways to get in on the big living room opportunity that Microsoft and Sony are fighting over. I think this is likely to happen, because Intel can also see what's going on: they see that they are being left out of the party, and they know that the average PC manufacturer's idea of an innovative consumer product is putting their logo on a flat-panel TV. Apple is a company that can make this happen for them. Even without the games, Apple knows users and knows user interfaces. It's not impossible that Apple's offerings will be so far superior to Sony's and Microsoft's that most users will use Apple's offerings for everything that isn't games, much as the PS2 being a crappy DVD player has resulted in most people having a separate DVD player in addition to their PS2.
Why not AMD: Great chips, great performance, great prices, but they don't really address either of the main reasons for Apple's switch: notebook chips and supply reliability. That's about all there is to say about AMD. Maybe their notebook line will shape up, but they've got a long way to go on being a reliable chip supplier. If you're gonna switch, you might as well solve that headache for good, and the deal will be sweeter if you go straight there.
Thursday, June 02, 2005
Profiles in Horrible Leadership Skills, with Miguel de Icaza
Shawn pointed out to me last night that I really have it in for Miguel de Icaza. I just can't help it. He's just such a horrible leader, and I figured I'd write about it, since, you know, it's not like any Gnome people are gonna write about it and let the people who couldn't watch the speeches know what was being said and decided about the future of Gnome.
I watched his GUADEC keynote speech last night, and it was unbelievable. His speaking style is best described as "unrehearsed." Most of the speech seemed like a long ramble, made up as he went along, based on a few slides. There was no vision for the future of Gnome presented, aside from some vague statements about the need for better usability testing.
Other great moments in leadership:
- He asked the audience how many people were using MacOS X, and after some people raised their hands, spat out "What are you doing here? You're at the wrong conference!" He was kidding, I think, but if you watch the speech, he wasn't being sufficiently cheerful for that to come across. In any case, it's a really bad idea to imply, even jokingly, that non-Gnome developers who went to Stuttgart to see him speak are not welcome in the community.
- He openly stated that he was hoping that Gnome would adopt his company's Mono technology, and screw his competitors (Red Hat, Sun), who otherwise provide just as much funding and manpower (I would guess) to the Gnome project.
- Someone mentioned Java, and although not both sides of the exchange are audible, he replied that (in effect) we don't have to think about Java in Gnome because no one is using it. When asked how many people used Java versus .Net, de Icaza said that one study said there were more .Net programmers than Java programmers, and another study said that there were twice as many Java programmers as .Net programmers. Of course, he preferred the former study. Clearly he prefers Gnome to have a large tent.
For contrast, look at Jeff Waugh's "10x10" talk, which, at least judging by Planet Gnome, has gotten developers excited. Waugh was the person at the conference who articulated a vision for Gnome. His vision was to have Gnome achieve 10% global desktop marketshare by 2010. He probably knows as well as everyone else that that won't happen.
Still, his talk was a creative, well-spoken list of suggestions for how to get there: why aren't we taking full advantage of our "friends in high places," like Google, Novell, Red Hat, HP, Sun, and the trade press? Why aren't we engaging hardware vendors to see what it would take for them to ship Gnome pre-installed on their computers (in certain configurations), so that we can work towards that? Why aren't we engaging independent software vendors to make them feel more like their applications are a part of Gnome even if they don't ship in the Gnome distribution? Why aren't we usability-testing our APIs instead of just our user interfaces? Why do we discourage Gnome software from being ported and distributed on Windows, which helps users move to our platform incrementally, and gets them excited about it? Why do we insist on shipping software that we think is abstractly elegant, but that our end-users overwhelmingly tell us they hate? Why don't we see keeping our language bindings up to date as part of our job?
All of those suggestions would greatly improve the Gnome project, whether or not they meet their 10x10 goals. As Thoreau said, "In the long run, men hit only what they aim at. Therefore, they had better aim at something high." That is the role of a leader: telling people what to aim at, and how to hit it.