How do you think 5G technology will impact video games?
Logically, the first thing that comes to mind is online gaming. It could be enriched, with more data transferred between players. Imagine that in a game I share data about position, health, the weapon used, shots fired, and so on. That isn't much data, because I want to send it as fast as possible. But with enough speed and reliability, I could also transfer much heavier elements, like textures and animations. That is how it would enrich multiplayer systems. You could even send real-time content to the game, such as new maps.
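To give a rough sense of the scale difference described above, here is a minimal Python sketch comparing a compact per-tick state packet with a streamed asset like a texture. The field layout, tick rate, and texture size are illustrative assumptions, not taken from any particular engine.

```python
import struct

# Minimal per-tick state update: position (3 floats), health,
# weapon id, and a "fired" flag. Layout is purely illustrative.
STATE_FORMAT = "<ffffBB"  # x, y, z, health, weapon_id, fired

def pack_state(x, y, z, health, weapon_id, fired):
    """Serialize one player's state into a tiny binary packet."""
    return struct.pack(STATE_FORMAT, x, y, z, health, weapon_id, fired)

packet = pack_state(10.5, 0.0, -3.2, 87.5, 4, 1)
print(len(packet))  # 18 bytes per player per tick

# At 60 ticks per second that is about 1 KB/s per player, while a
# single uncompressed 1024x1024 RGBA texture is 4 MiB. Streaming
# heavy assets like textures or whole maps mid-session therefore
# needs orders of magnitude more bandwidth and reliability.
texture_bytes = 1024 * 1024 * 4
print(texture_bytes // (len(packet) * 60))  # ratio vs one second of state
```

The point is not the exact numbers but the gap between them: game state is tiny and latency-sensitive, while assets are bulky, which is why richer links change what can be shared live.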
Do you think that new ways of playing or new genres will appear? What can these technologies do when combined?
I don't think VR or AR will change. What will change is the content we can represent. As I said before, you can transmit richer data. For example, I can send much more detailed VR avatars and significantly improve the experience.
The cloud seems to be the next step in the evolution of gaming hardware. But the most well-known cloud solution, Google Stadia, does not seem to be achieving great success. What steps do you think we need to take to exploit the potential of the cloud for the gaming industry? What do you think Google has done wrong, if anything?
I don't think Google is doing anything wrong. I believe users aren't quite ready to leave behind gaming consoles that deliver ever-improving experiences (the PS5, for example). I'm not a Stadia user, but the news I hear is not very good: there are still latency issues, and it doesn't play as "idyllically" as everyone thought it would. You only have to look at the sales of the new generation of consoles. Even with problems in the supply of components, they've exceeded all expectations. Besides, amid the push toward edge computing models, what sense does a model that transmits 4K images 60 times per second make? It seems strange to me.
Broadly speaking, what possibilities does edge computing technology offer from a technological and business point of view?
This philosophy of letting large data centers in the cloud "delegate" part of their responsibilities to edge devices seems more than reasonable, especially given the pace of deployment (about 30 billion IoT devices were expected by 2021). Even though I believe 5G can breathe new life into this immense flow of data, I think that, if only for security, efficiency, and the cost of data processing, making devices as autonomous as possible is still a good idea.
What infrastructure challenges does the massive adoption of edge computing bring to the table?
The only challenge I see is that the devices have to be more autonomous. In other words, they have to be more complex, with more hardware, and therefore more expensive. They are no longer "dumb" elements that send data without any knowledge.
As edge computing technology is implemented, our IoT devices are also going to change. For example, they should focus on more specific functions. What other changes are needed?
In the end, there has to be a balance between data processed locally and data sent to the cloud. The most significant changes will be in the IoT devices, which, as I was saying before, will increase in cost due to their new self-processing capability.
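The local-versus-cloud balance described above can be sketched very simply: an edge device processes raw readings itself and forwards only a compact summary upstream. The thresholds, sample values, and summary fields below are illustrative assumptions, not a real IoT protocol.

```python
# Hypothetical edge node: aggregate a window of raw sensor samples
# locally and upload only a small summary, instead of streaming
# every sample to the cloud.

def summarize(readings, alert_threshold=90.0):
    """Reduce a window of raw samples to the fields the cloud needs."""
    peak = max(readings)
    return {
        "count": len(readings),             # how many samples were seen
        "mean": sum(readings) / len(readings),
        "max": peak,
        "alert": peak > alert_threshold,    # anomaly detected on-device
    }

window = [20.1, 19.8, 21.0, 95.3, 20.4]  # e.g. one second of samples
print(summarize(window))
```

The cloud receives four fields instead of the full sample stream, but the device needs enough CPU and memory to do this work itself, which is exactly the added hardware cost mentioned above.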