Saturday, May 10, 2008

Tilt my rotor

There's this thing that's starting to turn up on the news and the cover of Time magazine and movies like Transformers. It's the Bell-Boeing V-22 Osprey, and it's the world's first operational tiltrotor. A tiltrotor is essentially a conventional aeroplane, except the normal engines are replaced with oversized turboprops that can tilt up to vertical and act as helicopter rotors, making the thing capable of vertical take-off. Tiltrotors are supposed to combine, in theory at least, the flexibility of a helicopter with the range and speed of an aeroplane. I've got a feeling that some of the compromises involved reduce the actual performance to something less than spectacular, but I'll leave that for another post. Today I want to start from the beginning, and talk about one of the first tiltrotors, the Bell XV-3.

The XV-3 was the first experiment in tiltrotoring, built in 1955 to iron out the kinks in the concept. The testing program was dominated by aeroelastic troubles: the combination of traditionally built wings and the relatively new art of helicopter rotor design created some interesting maths at a time when people were still using analog computers. Playing around with the XV-3 solved the first round of problems with the tiltrotor concept, paving the way for the V-22 (now seeing action with the Marine Corps in Iraq) and its in-development civilian counterpart, the BA609.

There is, however, one vital difference between the XV-3 and the V-22. The XV-3 had a single engine in the fuselage, driving the rotors at the wing tips through long drive-shafts running out through the wings, while the V-22 has its engines at the tips, tilting with the rotors. At first glance the tilting engines might seem like the better option. It sounds simpler, it eliminates the weight of the drive-shaft, and after all it is found on the production model of a military aircraft. But here's the problem: the V-22 also has a drive-shaft between the rotors, so that if one engine fails the other can power both rotors. So suddenly the superiority of the tilt-engine-pod isn't so clear.

Mounting the engine on the wing tip and tilting it presents other problems too. The control wiring, and more importantly the fuel piping (and probably some hydraulic lines too), have to pass through the pivot, crowding a lot of complex, weight-adding joints into a tight space. Cantilevering the engines off the wing creates structural problems as well: the wing root has to take all that load, which means it has to be stronger, which means it has to be heavier. And the majority of tiltrotors have been investigated by the military. Military rotorcraft usually have some kind of armoured floor to protect the cargo, passengers, and whatever else is in the fuselage. With tilt-engine-pods, that protection does not extend to the engines; with the engines mounted on the fuselage, driving the rotors through drive-shafts, it does. Finally, mounting the engines on the fuselage decreases the aircraft's moment of inertia in roll, providing a nice increase in maneuverability.
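The roll-inertia point is easy to put numbers on. A minimal sketch, treating each engine as a point mass; every figure below (engine mass, distances) is an illustrative assumption, not real V-22 data:

```python
# Roll moment of inertia contributed by the engines alone, treating
# each engine as a point mass at distance r from the roll axis:
# I = sum over engines of m * r^2. All numbers are assumed for illustration.

def roll_inertia(engine_mass_kg, arm_m, n_engines=2):
    """Moment of inertia (kg*m^2) of the engines about the roll axis."""
    return n_engines * engine_mass_kg * arm_m ** 2

engine_mass = 440.0   # kg per engine (assumed)
half_span = 7.0       # m from centreline to wing tip (assumed)

tip_mounted = roll_inertia(engine_mass, half_span)   # engines at the tips
fuselage_mounted = roll_inertia(engine_mass, 0.5)    # engines ~0.5 m off axis

print(tip_mounted)        # 43120.0 kg*m^2
print(fuselage_mounted)   # 220.0 kg*m^2
```

Because inertia grows with the square of the moment arm, moving the engine mass from the tips to the fuselage cuts its roll-inertia contribution by a couple of orders of magnitude, which is where the maneuverability gain comes from.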

It is claimed that since the drive-shaft on a tilt-engine-podded craft is for emergency use only, it can be lighter, but such a drive-shaft still needs to carry enough power to keep the thing aloft, which is not inconsiderable. It is also worth mentioning that since the shaft only needs to operate for short periods (probably one half of a full-range return journey), a lot of the weight needed to give it a reasonable service life can be cut. I have to wonder, though, whether the weight saving is as significant as is claimed.
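To get a feel for "not inconsiderable": the torque a cross-wing shaft must carry is just power divided by angular speed. The power and rpm figures below are rough, assumed values for illustration, not actual V-22 numbers:

```python
import math

# Torque the interconnect shaft must carry in a one-engine-out hover.
# torque = power / angular_speed. Figures below are rough assumptions.

def shaft_torque_nm(power_w, rpm):
    """Torque (N*m) needed to transmit power_w at the given shaft rpm."""
    omega = rpm * 2 * math.pi / 60.0   # shaft speed in rad/s
    return power_w / omega

power_to_far_rotor = 3.0e6   # W, roughly half of total hover power (assumed)
shaft_rpm = 6000             # fast, lightly loaded interconnect shaft (assumed)

print(round(shaft_torque_nm(power_to_far_rotor, shaft_rpm)))  # 4775 N*m
```

Even spun fast to keep the torque (and hence the diameter) down, the emergency shaft is still a serious piece of machinery, which is why I doubt the claimed weight saving.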

So far I've failed to mention the 800-pound gorilla in the corner of the drive-shaft showroom: the horrible, horrible vibrations. Having a spinning shaft running the length of a fairly flexible wing tends to shake the assembly to pieces in a pretty short time. This is probably why the tilt-engine-pod configuration was chosen in the first place. But here's the thing: since the XV-3 testing was shut down, several things have happened. Stiffer materials such as carbon fibre were developed, as was computing sophisticated enough to analyse the vibrational modes of a tiltrotor. Long drive-shafts have been used in other aerospace applications, such as helicopter tail rotors (the Sikorsky H-53 being a particularly high-powered and successful example), and I believe some of the tandem-rotor choppers (like the Chinook) use drive-shafts to power the front rotor.
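The classic vibration problem with a long spinning shaft is whirl at its first critical speed, which for a uniform shaft on simple supports follows from beam theory. A sketch of the calculation, with made-up dimensions (the formula is standard, the shaft geometry is assumed):

```python
import math

# First bending (critical) speed of a simply supported uniform shaft:
# omega_n = (pi/L)^2 * sqrt(E*I / (rho*A)).
# Spin the shaft near this speed and it whirls violently, which is the
# kind of trouble the XV-3 ran into. Dimensions below are assumed.

def critical_rpm(length_m, outer_d, inner_d, E=200e9, rho=7850.0):
    """First critical speed (rpm) of a hollow steel shaft between simple supports."""
    I = math.pi / 64 * (outer_d**4 - inner_d**4)   # second moment of area, m^4
    A = math.pi / 4 * (outer_d**2 - inner_d**2)    # cross-section area, m^2
    omega = (math.pi / length_m) ** 2 * math.sqrt(E * I / (rho * A))  # rad/s
    return omega * 60 / (2 * math.pi)

# A 6 m unsupported span of 80 mm x 70 mm steel tube:
print(round(critical_rpm(6.0, 0.080, 0.070)))  # ~351 rpm
```

A single long span goes critical at a uselessly low speed, which is why real long-shaft installations (helicopter tail rotors included) break the run into short segments with intermediate bearings: halving the unsupported length quadruples the critical speed.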

So I have to wonder if the move away from centre-mounted engines driving tip rotors through drive-shafts is the best possible direction for tiltrotor aircraft. I suspect that I may have oversimplified the above arguments a bit (I neglected to mention gearboxes and swashplates, on the assumption that they are the same for each type, which may not be the case), but I really think that centre-mounted engines have a distinct advantage over tilt-engine-pods that is worth pursuing.
What is interesting to me is why centre-mounted engines have been essentially forgotten for most tiltrotors. I suspect a large part of it is the protracted development of the tiltrotor in general, which has been going on since the mid-fifties but is only now seeing production. But a more significant, if subtler, effect lies in the fact that the tiltrotor concept was too far ahead of its time. The inability to properly analyse the vibration modes of the XV-3 led to the adoption of tilt-engine-pods in test aircraft (namely the XV-15, the precursor to the V-22) to simplify the engineering and the results from testing. The development of fairly successful drive-shafts went unnoticed by the tiltrotor's designers, who were probably focused on other things.
The interesting thing is that although it is only one aircraft, the V-22 has been in development for so long that it is now the face of tiltrotor aircraft, and so the tilt-engine-pod is seen as the de facto choice of engine placement. What happened, then, is that the practical application of the idea grew not out of a thorough and complete analysis of all possible designs, but as an iteration of a well-known design, one originally chosen because it was the easiest to analyse. The V-22, a combat aircraft, shows the pedigree of a research aircraft. I wonder, then, if this is proof that the tiltrotor is undergoing proper evolution, rather than a simple progression of designs. The anatomy of the tiltrotor has retained those phenotypes that, while not optimal, provided an advantage in the past, and so were selected.
I kind of cheated you here, because I want to talk about evolution, not aeroplanes. I think the really important thing to realise is that this needn't be so. Human beings possess both memory (of the XV-3) and some damn good computers, things that DNA does not. We are able to revisit and reconsider our failures, rather than carrying on blind to our history. Our flying machines therefore are not limited to random mutation and survival of the fittest, and should be all the better for it.

Thursday, May 8, 2008

Some Of My Favourite Things #1

(I said before I was going to do a couple of lists. I've now decided not to, since the Internet is mostly made up of lists and pictures of cats. So I thought instead I'd start an occasional feature about machines I really like for one reason or another. Here we go!)


NASA is going back to the Moon on a ship called Orion. It's basically an upgraded Apollo capsule, and it is not the first spaceship called Orion. Back in 1958, General Atomics started work on Project Orion, a fiery Old Testament spaceship propelled by nuclear bombs. The thing was supposed to weigh 4000 tonnes, 400 of which made up a pusher plate connected to the rest of the ship by five-storey-high nitrogen-filled shock absorbers. The plate was there to absorb the focused blast from fusion-boosted fission devices thrown out the back of the ship. The thing was ridiculously primitive, yet incredibly powerful. It is the only proposed spaceship capable of taking off from the Earth's surface and performing an interplanetary voyage in a single stage. A scaled-down version sized to fit a Saturn V upper stage was also proposed. The fallout would have killed people, which is why it was ultimately never built (the Partial Test Ban Treaty of 1963 was the immediate reason), although a test of the pusher plate was performed with conventional explosives.
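The single-stage claim comes straight out of the rocket equation. A sketch, using commonly quoted order-of-magnitude specific impulse figures rather than any actual design data:

```python
import math

# Tsiolkovsky rocket equation: delta_v = Isp * g0 * ln(m0 / mf).
# The Isp values are rough, commonly quoted orders of magnitude,
# not figures from the actual Orion design documents.

G0 = 9.81  # m/s^2

def delta_v(isp_s, mass_ratio):
    """Achievable delta-v (m/s) for a given specific impulse and m0/mf."""
    return isp_s * G0 * math.log(mass_ratio)

mass_ratio = 2.5  # a modest, single-stage-friendly mass ratio (assumed)

chemical = delta_v(450, mass_ratio)   # a good chemical rocket engine
pulse = delta_v(5000, mass_ratio)     # nuclear pulse units, order of magnitude

print(round(chemical))  # roughly 4 km/s
print(round(pulse))     # roughly 45 km/s
```

With an effective specific impulse ten times that of chemical engines, the same structure gets ten times the delta-v, which is the difference between barely reaching orbit and wandering around the solar system on one stage.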

This is probably the most optimistic thing ever attempted by man. This was an interplanetary spacecraft that was supposed to be made from regular steel. The mass budget included a two tonne barber's chair. The missions were planned on the assumption that the General Atomics design team (which included physicist Freeman Dyson and bomb designer Ted Taylor) would be going along for the ride. This machine was a serious attempt to take the bombs that razed cities to the ground and use them to create for real the space opera adventures of the forties. It was Space Odyssey before Space Odyssey was even made.

Like most spaceship proposals, Orion was hopelessly uneconomic. By the time the designs were finally filed away, the only justification for its continued existence was the unfounded suspicion that the USSR was conducting a similar project. All that is left now are a few design projects under the much less alarmist heading of "pulsed plasma propulsion", and the General Atomics building, a cylindrical office the exact diameter of the 4000-tonne proposal.

Completely useless, absurdly dangerous, absolutely practical and God damned exciting.

References: Mostly taken from Project Orion: The True Story of the Atomic Spaceship by George Dyson. George is the son of Freeman, and interviewed first-hand many of the people who worked for General Atomics. The book is a bit of a mess, but full of first-hand accounts of an atomic spaceship.
There's also this: http://www.damninteresting.com/?p=679#more-679 which has a picture of the 4000t proposal,
and this: http://www.projectrho.com/rocket/index.html for all your atomic rocket needs.

Saturday, May 3, 2008

Explosions 'n' shit

I want to talk about CGI. I want to do this because last year the highest-grossing movies, the ones that the most people went and saw, were all CGI-heavy, but none of them won an Academy Award. I figure this means that although 300, Spider-Man 3, and Transformers were all fun (which is the point of movies), they didn't really make for compelling viewing, while Juno and No Country for Old Men did. Since this is notionally a technology blog, I'm going to try and figure out whether CGI itself really has anything to do with this.

I'll start with the obvious, since a lot of people are saying that CGI is used as a substitute for proper plotting and character (I'm looking at you, Michael Bay). Compare the original The Poseidon Adventure from 1972 with 2006's Poseidon. The former featured smallish sets and unconvincing model work, which left plenty of room for Gene Hackman's struggle with God, Ernest Borgnine's embarrassment over marrying a prostitute, and Shelley Winters' swimming. The latter killed off unlikeable characters in increasingly unlikely scenarios, sometimes by having them kill each other. It held the Guinness World Record for the most detailed CGI model in a film. What we have here is a clear example of too much CGI (and that model was used a lot) at the expense of drama.

I think it's time to put the lesson here in its most important terms: there is only so much money under the bed, and more CGI means less money for everything else. But I don't think this is strictly fair on CGI. The money is spent on CGI because people want CGI. I'm not going to argue about why, or even who (it's Michael Bay, mostly), because it seems pretty obvious that you could spend CGI money on other things, like real props, or real scripts, or real people. Like most everything, there can be too much CGI, and deciding how much CGI to use is as much a part of good filmmaking as casting or scripting.

I don't think that was a surprise to anyone, so I'll move on. To Revenge of the Sith. Just so you know, I have to try really hard not to mention all the reasons this movie sucked, and just mention one. You know in The Empire Strikes Back when Vader tells Luke who he really is, and Luke is dangling off an aerial? You know how Han spends most of that movie up to his knees in wiring? Did you see that in Revenge of the Sith? No. Harrison Ford looked genuinely perturbed when the guts of the Millennium Falcon sparked in his face, probably because a chunk of pyrotechnics actually went off near his eyes. For most of Episodes II and III, Hayden Christensen and Ewan McGregor look at most mildly irritated, even when riding robots through lava for some reason.

The thing is, Ewan McGregor and Hayden Christensen couldn't see lava-robots, they could just see a giant green room. Which is why they act like they are in a giant green room, and debate betrayal and the nature of good and evil (or something), instead of worrying about the fucking lava.

This is a more fundamental problem than over-use. CGI can't actually be present during recording, so actors have nothing to react to. But this is not a new problem: actors could hardly react realistically to stop-frame animation dinosaurs back in the thirties, and they didn't have to. Model work was separated from the live-action sets out of necessity, which meant actors weren't really called on to react to things that weren't there. Split-screen effects changed the situation slightly, but there was never any interaction with physical objects that weren't in the room. So again the problem isn't with CGI itself, but with its use, which in Episode III was hardly judicious (I think at the end there is a CGI Peter Cushing). So there is a limit to what CGI can do well, but that's true of almost everything in filmmaking. Deciding when to use CGI is also part of good filmmaking.

I think now it's time to talk about a real problem with CGI. The 1933 King Kong was a pretty simple movie. It had straightforward characters with few dimensions that didn't develop, and the plot consisted of a series of set pieces where a giant gorilla broke things, usually people. But to show the gorilla breaking things took hours and hours of stop-frame work: slightly adjusting the plasticine models, taking a photo, then repeating, with a second of film taking about twenty-four photos. That's really hard, since it takes forever and needs to be really well coordinated. The 2005 King Kong was basically the same story, with some mild character development and a shitload more action. Instead of stop-framing, the animators built an incredibly detailed CG model, including the underlying bone and muscle structure, as well as software to translate human expressions into gorilla expressions. This was then combined with motion capture from a suit worn by Andy Serkis.
I have no idea what most of that means, since like most people I've never done anything like it. I have, however, made models, and I know twenty-four photos is a lot for one second of footage. Here, I think, is the most important characteristic of CGI: people have no idea how it works, it's just "all done with computers". The average movie-goer can't really relate to the effort put into coding the way they can to model making and taking loads and loads of photos. The incredible complexity of CGI also means that opening titles now say "Visual effects by Industrial Light and Magic", with the closing credits full of names too small to read. Back in the day there were no closing credits, and the titles said "Special effects by Willis O'Brien and Ray Harryhausen" and that was that. So if there is any problem with CGI itself, it is that it is difficult to relate to, since no one knows who made it or how they did it.
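The stop-frame arithmetic is worth spelling out; sound-era film runs at 24 frames per second, and every one of those frames is a separate hand-posed photograph (the scene lengths below are made-up examples):

```python
# Stop-frame arithmetic: at 24 frames per second, each second of finished
# animation is 24 hand-adjusted photographs. Scene lengths are examples.

FPS = 24

def photos_needed(seconds_of_footage):
    """Number of individual photographs for a stop-frame sequence."""
    return seconds_of_footage * FPS

print(photos_needed(1))        # 24 photos for one second
print(photos_needed(60))       # 1440 photos for a minute
print(photos_needed(10 * 60))  # 14400 photos for ten minutes of creature work
```

Ten minutes of on-screen gorilla is over fourteen thousand separate poses, which is exactly the kind of visible, countable effort an audience can appreciate in a way they can't with code.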

So I hope I have made a good point here: that technology only makes things better if you use it right, and that art, real art, is better if people know how hard it is to make. I mostly hope, though, that no one actually makes a MacGyver movie.