Sony recently filed a patent for an artificial intelligence capable of automatically generating and adjusting musical patterns and their tempo in video games. Given the firm’s determination to capitalize on sound immersion, will the PlayStation 5 be able to incorporate this new technology?
In the future, will video game composers have to collaborate with artificial intelligence? A recent patent filed by Sony has, in any case, put the subject on the table. On May 22, a patent for an invention relating to the “creation of dynamic music in a game” was made public. It is officially summed up as follows: “This invention concerns a process and a system for dynamic music creation. An emotion is attributed to one or more musical motifs, and a game vector is associated with the emotion. The musical motifs are mapped onto the game vector on the basis of the emotion. A musical composition is generated on the basis of the game vector and the desired emotions.”
Playing more with emotions
In other words, the artificial intelligence designed by the Japanese firm would first collect a large body of musical data and sort it according to the emotions usually associated with certain recurring patterns. The AI would then be fed the scenes and narrative elements of the targeted video game, to which emotions would have to be attached. The human composer would only intervene afterward, supplying musical motifs of his own creation associated with specific characters, levels, or gameplay elements. This association could even be carried out entirely by the AI itself, dynamically. Finally, the algorithm would only have to generate or adjust a musical pattern based on the duration of a session, the in-game virtual environment, or even the player’s style of play. Indeed, according to an analysis by The Next Web, the tempo of the music could even adapt dynamically depending, for example, on whether the character walks or runs. In its patent, Sony also plans to add a more personal dimension to the system: if the player shares data related to his personality, the AI would be able to analyze it and make musical changes in the game!
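The pipeline described above can be sketched in a few lines of Python. This is purely illustrative: the motif names, the shape of the “game vector”, and the emotion-selection rules are assumptions made up for the example, not details taken from Sony’s filing.

```python
# Hypothetical sketch of the patent's pipeline. All names and thresholds
# here are illustrative assumptions, not taken from Sony's patent.

# Step 1: motifs pre-tagged with emotions (in the patent, learned from data).
EMOTION_OF_MOTIF = {
    "hero_theme": "triumph",
    "cavern_drone": "dread",
    "village_waltz": "calm",
}

def game_vector(session_minutes, player_running, combat_intensity):
    """Bundle a few in-game signals into a 'game vector'."""
    return {"session": session_minutes,
            "running": player_running,
            "combat": combat_intensity}

def target_emotion(vector):
    """Crude stand-in for the AI that maps the game vector to an emotion."""
    if vector["combat"] > 0.7:
        return "dread"
    if vector["running"]:
        return "triumph"
    return "calm"

def pick_motif(emotion):
    """Select a motif whose tagged emotion matches the target."""
    for motif, tagged in EMOTION_OF_MOTIF.items():
        if tagged == emotion:
            return motif
    return "village_waltz"  # fallback motif

def adjust_tempo(base_bpm, vector):
    """Speed the music up when the character runs, as The Next Web suggests."""
    return int(base_bpm * (1.25 if vector["running"] else 1.0))

vec = game_vector(42, player_running=True, combat_intensity=0.2)
emotion = target_emotion(vec)
print(pick_motif(emotion), adjust_tempo(100, vec))  # → hero_theme 125
```

A real system would of course replace the hand-written rules with learned models, but the control flow — game state in, emotion inferred, motif and tempo out — mirrors the patent’s summary.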
Immersion through sound?
While this invention, still at a very conceptual stage, may never see the light of day, it is not trivial. Sony seems determined to capitalize on video game immersion through sound. “A game looks dead without an audio dimension,” PS5 lead system architect Mark Cerny said at a conference in March. He even announced that the manufacturer had created, especially for the PlayStation 5, the Tempest 3D AudioTech engine, which is supposed to accentuate the “intensity of a gaming experience”. Thanks to it, developers will be able to set up as many as 100 distinct multi-dimensional sound sources, allowing the player to pinpoint the exact location (and not just the direction) a noise comes from in the virtual world. Sony’s idea is also to offer “good audio quality for all”, so as not to create inequality based on players’ equipment. It may well be that one day the PS5 will implement this new kind of musical creation.