How to Predict the Future of Technology

Creating the technologies of tomorrow requires developing a vision of how people will interact with computing power in 10 to 15 years.

Brian David Johnson

Brian David Johnson, Intel's Futurist

Brian David Johnson is Intel’s “futurist,” which means his job is to look out 10 to 15 years ahead and develop plans that Intel engineers can use to create technology for, well, the future. His job is a complicated mix of sociology and research, looking deeply into how people interact with computers and computation today to anticipate how it will evolve over time.

In a recent interview with Adrenaline, a glossy magazine for software developers published by Intel’s Software and Services Group, Johnson talked about his job, his role as a social scientist and the human element of design.

How do you go about projecting 10 to 15 years into the future?

We start with social science. We have, in our lab, ethnographers and anthropologists who go all over the world to study people and give us insights into human behavior — how humans communicate with each other, how humans live, how people interact with their governments, how they buy things and what their cars are like. Whatever you can think of, they are looking at it.

That gives us a basis — we have to remember that we are building products for us, for people. From there, I look at the computer science side of things: the people who are doing the innovative hardware and software development that goes on at Intel.

Next we ask, “What is possible with technology?” We look back at those human insights and ask, “OK, how do we make people’s lives better?”

Then I like to look at trends, what I call the math of the future. Most people start with population growth and the projections of where we are going. Although those are important to me, they aren’t as important as the first two steps — social science and computer science — because, again, we have to understand the people we are building for, and then we have to understand the technology that we are building.

In terms of computing, what do you see the future looking like in 20 years?

I am an incredible optimist for a number of reasons. Everything I do is based upon social science research. When you talk to people about computers, devices and gadgets, they're generally very optimistic. They think it is cool.

That is one of the things we can’t forget — for most people, the future is going to be pretty awesome. We can’t let ourselves forget that we will be surprised, and we can’t discount that when we pick up an Ultrabook for the first time, we’ll say, “Wow, that feels really cool.” I think when we talk about the future of economies and the future of Intel, we can’t forget that in the future that wow is still going to happen — and that is pretty cool.

What is your history in the tech industry?

My first job was at the computer lab at the local university in Virginia. That was back when every printer room had one printer, and that printer was in a soundproof box. And there was an entire room of Wang word-processing machines and a room full of mainframe terminals. I was there when they carted in the first personal computer. The joke was that it was called a personal computer because you could lift it by yourself.

So we have come a long way since then?

Oh, yeah! I always laugh about the computers that I learned to program on; today, we carry around more computational power in our pockets than those machines ever had.

How has your work as a futurist informed your view of the industry as a whole?

Well, it has made me very boring, to be quite honest. I am a very pragmatic futurist. The work I do is for the specifications of processors, so I have to make sure that whatever visions I come up with are really grounded and that we can build them. If I tell Intel that we're all going to have rocket cars and jet packs, and come 2020 we don't have rocket cars and jet packs, then this futurist won't have a job.

Everything we do is based in social science first and foremost. We are designing processors, platforms and multiple products, and even the software and the algorithms that go into those products from a human standpoint. The futures we are looking at … need to be very accomplishable.

What effect do you see smaller screens and portable form factors having on the industry going forward?

Computing power has spread and found its way into our living rooms and pockets, and is finding its way into our cars, walls and hospitals. For the longest time people asked, "Will the PC kill the TV?" Now you hear them ask, "Will the smartphone kill the laptop?" or "Will the tablet kill the laptop?"

One device isn’t going to rule them all; it is about whatever device people have handy. People really like choice. People will watch “Inception,” a big blockbuster movie, on their big-screen TV at home, but if they happen to be stuck in an airport or on a bus, they will watch it on their smartphone. With that type of power on those small screens, computation fits more elegantly into people’s lives.

You have a smartphone, a tablet, an Ultrabook, a television — all these things begin to fit quite nicely together, becoming more about the consumer and the consumer’s choice about the kind of screen they would like to interact with.

As these high-powered mobile screens become more and more ubiquitous, how do you see them affecting daily life?

They allow us to have access. In a lot of the nearer-term research I was doing, looking out to 2015, you have all these different screens, and the computational power, input and output, and battery life that allow those screens to become windows that give you access to the people and the entertainment you love. That is what drives most people.

All of these mobile form factors and screens really give us a myriad of ways to make that connection in different places, in different areas and in different spaces, and I think that will only continue.

Are the differences between platforms becoming less important to the public at large?

It’s not just about processor speed or the type of processor. We have multi-core, many-core and single-chip cluster computers. There are different ways of delivering computational power and coming up with solutions to different problems — whether you want a tablet or a smartphone that lasts all day or you need a high-performance computer to calculate particle physics for the Large Hadron Collider. These are very different types of computation.

Inside Intel, it isn’t just about making it smaller, faster and less expensive, although this is important and it’s what we will continue to do — we live in the house of Moore’s Law. That is necessary but not sufficient. We are seeing a significant shift: the way people understand computational power has less to do with the guts and more to do with the experience.
