Before bending actual metal for a new Boeing aircraft, for instance, its designers ought to be able to feel what it will be like to sit in it as a passenger, to fly it as a pilot, and to fix it as a member of the ground crew. Architects should be able to enter a building that exists only in their imagination and their software, in order to see how light falls into it at noon in January and at dusk in June. They should also be able to simulate the experience of people trying to get out of a building in a hurry if, God forbid, someone were to fly an aeroplane into it; to feel how it shakes in an earthquake; and so on.
If all this sounds like the visions of “virtual reality” long touted by science fiction and Hollywood, that is unfortunate but unavoidable. Ordinary people are already having the sort of experiences that Mr Bass describes, through the medium of online games such as “Second Life”, which lets its visitors create anything they can imagine: with a few clicks, they can build houses, islands and spacecraft, and walk through or fly over the things created by other players.
To be useful to real-world engineers, however, Mr Bass thinks that virtual reality should stimulate as many of the five senses as possible. In software today, says Mr Bass, “we’re at a pretty crude approximation of sight only.” Within a decade or so, he thinks, Autodesk should be able to model touch and hearing as well, although smell and taste will be harder. Designers, architects and engineers, by the sound of it, might soon be wearing wired gloves and full-body touch-suits.