NEC is advancing its AI efforts, targeting generative AI applications and large language models (LLMs) to drive industry innovation by 2024.
NEC’s AI push includes launching new services and partnering on industry-specific AI models in pursuit of growth and technological advancement.
NEC’s 2023 AI focus is on developing a proprietary LLM and a generative AI hub to boost efficiency in healthcare, finance, and manufacturing.
NEC is riding this year’s wave of AI enthusiasm with a clear focus. Standing out in its portfolio is a bespoke LLM tailored specifically for the Japanese market. The model runs on 13 billion parameters, a stark contrast to the trillion-plus parameters OpenAI’s GPT-4 is reported to use.
Introducing the NEC generative AI service
The firm has announced the launch of its NEC generative AI service, which will begin by offering licenses for its proprietary LLM, along with specialized hardware, software, and consulting services. The initiative is set to expand into new markets beyond fiscal year 2024.
In a strategic move, NEC has teamed up with around ten companies and universities to launch the NEC generative AI advanced customer program in July. The program is designed to work with clients on developing tailored models and on advancing the software and organizational structures needed for the effective use of LLMs.
NEC has also begun the development of the NEC generative AI hub. This hub fosters business transformation by providing customers access to a pool of researchers, engineers, and consultants. These experts are tasked with delivering precise guidance for creating bespoke AI systems.
The introduction of these services is poised to significantly broaden the scope for operational transformation across diverse sectors, including healthcare, finance, municipal governments, and manufacturing. Furthermore, NEC is focusing on crafting specialized models to spearhead business transformation and enhance generative AI applications from single businesses to entire industry sectors through its managed API services.
NEC’s impact across industries
NEC has made significant advancements in its LLM by doubling its volume of high-quality training data, resulting in a model that surpasses several leading domestic and international LLMs in Japanese dialogue capability, as measured on the Rakuda benchmark. The LLM’s capacity to process up to 300,000 Japanese characters, up to 150 times more than other third-party LLMs, positions it as a versatile tool for large-scale document work, including business manuals.
NEC has made significant advancements in its LLM. (Source – NEC)
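NEC has not published integration details for this long-context capability, but a minimal sketch can illustrate why a 300,000-character window matters for manual-sized documents: whole manuals can be handled in a single pass rather than split into chunks. The helper below is a hypothetical illustration only; the constant reuses the figure cited above, and nothing here reflects NEC’s actual tooling.

```python
# Hypothetical sketch: decide whether a document fits in one pass or must be chunked.
# The 300,000-character capacity is the figure cited in the article; the code itself
# is an illustrative assumption, not NEC software.

MAX_CHARS = 300_000  # context capacity cited for NEC's LLM (Japanese characters)


def plan_processing(document: str, max_chars: int = MAX_CHARS) -> list[str]:
    """Return the document as a single chunk if it fits, otherwise split it."""
    if len(document) <= max_chars:
        return [document]  # an entire business manual can go through in one request
    return [document[i:i + max_chars] for i in range(0, len(document), max_chars)]


if __name__ == "__main__":
    manual = "例" * 250_000  # a 250,000-character manual fits in a single pass
    print(len(plan_processing(manual)))  # -> 1
```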
NEC is developing a “new architecture” that promises to “revolutionize” the creation of AI models. This architecture allows for flexible combinations of models based on the input data and specific tasks.
NEC aims to establish a scalable foundation model whose parameters and functionality can be expanded efficiently. The model promises scalability without compromising performance and can be integrated with various AI models, including those specialized in legal and medical fields and those from other companies and partners. Its compact size and low energy consumption also make it suitable for edge devices. Combined with NEC’s image recognition, audio processing, and sensing technologies, these LLMs can process real-world events with high precision and autonomy.
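NEC has not disclosed how this architecture is implemented, but the idea of flexibly combining a compact base model with domain specialists can be sketched as a simple task-based router. Everything below, including the class, model names, and stub inference functions, is an illustrative assumption rather than an NEC interface.

```python
# Hypothetical sketch of task-based model routing, loosely inspired by the
# "flexible combination of models" idea described above. None of these names
# correspond to real NEC components.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class ModelSpec:
    name: str
    generate: Callable[[str], str]  # stand-in for a real inference call


def legal_model(prompt: str) -> str:
    return f"[legal-specialist answer to: {prompt}]"


def medical_model(prompt: str) -> str:
    return f"[medical-specialist answer to: {prompt}]"


def base_model(prompt: str) -> str:
    return f"[compact general-purpose answer to: {prompt}]"


# Registry of specialized models keyed by task label.
SPECIALISTS: Dict[str, ModelSpec] = {
    "legal": ModelSpec("legal-13b", legal_model),
    "medical": ModelSpec("medical-13b", medical_model),
}


def route(task: str, prompt: str) -> str:
    """Send the prompt to a domain specialist if one exists, else the base model."""
    spec = SPECIALISTS.get(task, ModelSpec("base-13b", base_model))
    return spec.generate(prompt)


if __name__ == "__main__":
    print(route("legal", "Summarize this contract clause."))
    print(route("finance", "Explain this quarterly report."))  # falls back to the base model
```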
Simultaneously, NEC is developing a much larger model with 100 billion parameters, far exceeding its current 13-billion-parameter model. These endeavors are part of NEC’s ambition to reach roughly 50 billion yen in sales from its generative AI business within three years.
The development and application of generative AI have recently seen a rapid acceleration. Businesses and public institutions are actively exploring and testing business reforms using various LLMs, and the demand for such transformations is anticipated to rise.
Challenges remain in its implementation, though, including the need for precise prompt engineering to guide the AI, security issues such as data leaks and vulnerabilities, and the integration of business data during the implementation and operational phases.
Since the launch of the NEC generative AI service in July 2023, NEC has been leveraging the NEC Inzai data center, which provides a secure, low-latency LLM environment. NEC has been accumulating expertise by building and offering industry-leading “individual company models” and “industry-specific models” based on its proprietary LLM.
NEC is drawing on this accumulated expertise to deliver optimal solutions for various industries. These solutions comprise a scalable foundation model for LLMs and an environment tailored to the specific needs of customers’ businesses for the effective use of generative AI.
NEC’s generative AI business strategy for other potential applications
NEC’s generative AI business strategy is built around a three-stage development process based on managed API services. At its core is a managed API service that uses NEC’s large language models for dialogue and search functions. The strategy is planned to unfold in three phases: first, constructing industry- and business-specific models for individual companies; second, integrating these models into business packages and solutions; and third, developing business packages and solutions through partnerships with other companies.
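The article does not include a public specification for NEC’s managed API, so the sketch below only illustrates, under stated assumptions, what a client call to such a dialogue endpoint might look like. The URL, payload fields, and authentication header are placeholders, not NEC’s actual API.

```python
# Hypothetical client call to a managed generative-AI API offering dialogue and
# search. Endpoint, payload fields, and auth scheme are illustrative assumptions.
import requests

API_URL = "https://api.example.com/v1/dialogue"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                         # placeholder credential


def ask(question: str, industry_model: str = "manufacturing") -> str:
    """Send a dialogue request, selecting an industry-specific model variant."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": industry_model, "input": question},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["output"]


if __name__ == "__main__":
    print(ask("Summarize the maintenance manual for line 3."))
```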
The development of generative AI at NEC should fascinate the industry.
NEC is also enhancing its technology development and sales framework. This initiative includes the NEC generative AI hub, which operates under the direct supervision of the chief digital officer (CDO), and the digital trust promotion management department. Furthermore, to advocate for the generative AI business, ‘ambassadors’ have been appointed within the corporate sales department.
To further bolster its generative AI initiatives, NEC has established a new generative AI center. This center oversees the generative AI technology pivotal to these activities. It integrates approximately 100 leading-edge researchers from global research centers specializing in generative AI. The aim is to fast-track the commercialization of generative AI research by fostering seamless collaboration between the research and business segments.
NEC’s generative AI business strategy. (Source – NEC)
As part of its commitment to providing safe and reliable LLMs, NEC is collaborating with Robust Intelligence on the LLM Risk Assessment Project. This initiative is focused on ensuring that the industry- and business-specific models NEC offers are evaluated and meet global standards for risk assessment.
As NEC continues to pioneer AI innovations, it remains dedicated to consulting with customers to enhance the value of its services, including expanding service functions. NEC’s commitment extends to providing secure and reliable generative AI solutions aimed at addressing the unique challenges faced by its customers.