Classic business application platforms are structured in a complex, sophisticated way, with parts of the data model tightly coupled to one another to represent the overall structure of the system. The more complex the data model, the slower platform development becomes, which makes it impossible to quickly update a data model and move directly on to the next application development phase.
Throughout its history, AI has gone through different periods of development and social acceptance. Over the last five decades, it has seen periods of research stagnation as well as periods of declining interest. Breakthrough highlights include the surge of interest in expert systems during the 1980s, IBM's chess-winning supercomputer Deep Blue in 1997, and the intelligent agents and knowledge-based systems of the last decade.
The evolution of hardware has proceeded at a steady pace of shrinking component sizes and reduced signaling distances, constantly delivering higher performance. In software, by contrast, the basic principles have remained the same for decades; still, as with everything else, the software industry continues to make improvements.
A few weeks ago, our CEO Kristoffer Lundegren did an interview for Virtual-Strategy Magazine. The interview can be found on the Virtual-Strategy Magazine website (link below): Interview with Kristoffer Lundegren, CEO of