Business Area
From data centers
to edge devices.
We support AI training and inference from data centers to edge devices using AI accelerators from leading global companies such as HyperAccel, DEEPX, NVIDIA, and AMD.

Central Processing Unit
CPU
Offering a range of servers built on the latest Intel and AMD CPUs to give customers more choices.

Neural Processing Unit
NPU
Offering NPUs optimized for diverse edge environments with ultra-low power and high performance using the DX-M1, DX-V3, and DX-H1 series.

Video Processing Unit
VPU
Providing VPUs (Video Processing Units) suitable for services such as live streaming, security surveillance, and cloud gaming.

Graphics Processing Unit
GPU
Providing solutions to optimize AI model performance using Blackwell- and Hopper-based GPU systems.

LLM Processing Unit
LPU
Providing LLM Processing Units (LPU) that can dramatically reduce generative AI inference costs in both data center and edge environments.

Data Processing Unit
DPU
Providing Data Processing Units (DPU) that offload networking, storage, and security workloads from the CPU to improve data center efficiency.
Challenge
DIA NEXUS
AI Accelerator Portfolio

Recommendations tailored for training and inference purposes
Training and inference workloads place different demands on accelerators. Daewon CTS understands these characteristics and provides accelerator solutions optimized for each specific use case.

Proposals suited to specific operational environments and conditions
Daewon CTS provides customized solutions tailored to each environment, from data center rack space and facility requirements to low-power constraints and installation space conditions at the edge.

AI infrastructure
architecture design
With expertise in designing AI infrastructure for heterogeneous accelerator environments, Daewon CTS offers optimal system architectures and management strategies aligned with customer needs.

Partner Ecosystem
DIA NEXUS
Partner Ecosystem
Daewon CTS develops adoption and implementation strategies from solution selection to integration with existing systems, establishing build and execution plans from an AI full-stack perspective.


DEEPX, an AI semiconductor specialist, develops NPUs and solutions for servers, edge devices, and IoT devices, and has recently expanded into the LLM inference–focused semiconductor market.
NPU
We offer DEEPX NPUs, including the DX-M1, DX-V3, and DX-H1 series, delivering ultra-low-power, high-performance AI inference for diverse edge environments.
Module
We offer DX-M1 and DX-H1 modules, designed to easily integrate AI capabilities into a variety of edge devices. This enables seamless addition of AI functionality across diverse edge environments.
DXNN
DEEPX supports custom NPU development through the DXNN SDK, and its proprietary NPUs achieve higher power efficiency than traditional GPUs while incorporating unique patented technologies.
