Samsung: AI expands creativity with Galaxy S26 camera

AI helps the Galaxy S26 overcome hardware limitations, making post-production smarter and improving the quality of high-zoom photos, thereby elevating the photography experience, according to Mr. Joshua Cho.

On February 26, the Galaxy S26 series was officially launched at the Galaxy Unpacked 2026 event, marking the third generation in the Korean giant’s journey of developing AI-integrated Galaxy phones. Among the standout AI experiences, creative photography capabilities have attracted the attention of many technology enthusiasts. To analyze the advances in the Galaxy S26 series camera in more depth, Mr. Joshua Cho, Executive Vice President (EVP) in charge of developing the Galaxy camera and the entire image processing system, sat down with VnExpress to clarify the changes in camera hardware and the role of AI in photography.

Mr. Joshua Cho at the press meeting, on the sidelines of the Galaxy Unpacked 2026 event. Photo: Samsung

– Camera is one of the core elements on the Galaxy S series. On the latest generation, what philosophy is the camera experience developed based on?

– Samsung’s camera development philosophy has been built over more than 40 years, including the previous period of DSLR camera development. For us, the camera is not just a recording tool, but a “language of emotions”, allowing users to preserve and share moments in all conditions – from day to night, from motion to still life.

On the Galaxy S26, this philosophy is embodied in five pillars: light, the human factor, the ability to shoot anytime, creativity for everyone, and AI that overcomes hardware limitations. In particular, AI plays an increasingly large role, especially as Samsung switches to an AI-based image processing system (AI ISP) and expands editing tools such as Photo Assist and Creative Studio.

– Many people say that the camera on the Galaxy S26 has not changed much. Can you explain the extent of the hardware changes in this generation?

– Regarding the aperture change: the sensor is unchanged, but the entire lens and module system was redesigned to support the new aperture. A larger aperture produces a shallower depth of field, which significantly increases the complexity of the autofocus (AF) system. We therefore had to redesign the entire system to maintain performance, and this is a big change, not just a small adjustment.

– Samsung previously introduced a variable aperture on the Galaxy S9. Why hasn’t this technology returned in recent generations?

– Variable aperture was introduced on the Galaxy S9, and to this day the technology is still being researched and continuously monitored. We are always evaluating whether the feature brings real value to users and when the time is right to bring it back. Specific answers will come in future products.

– Will features like Photo Assist and Creative Studio be brought to older devices like the S24 or S25? If so, what advantages does the Galaxy S26 retain?

– Extending features like Photo Assist and Creative Studio to older devices will follow the overall Galaxy AI roadmap. Samsung has plans to expand, but it will happen in stages.

For the Galaxy S26, the advantage is that these features are available from day one, and on-device processing, including both pre-processing and post-processing, is faster and more powerful than on older devices.

Regarding the APV codec, Samsung is aiming for an open standard. This means other manufacturers can also use it, and the codec is compatible with the existing ecosystem, including tools such as DaVinci Resolve.

– As artificial intelligence advances and can generate photos and videos from just a prompt, what will the future of smartphone cameras look like?

– AI is developing so rapidly that it is difficult to predict exactly what will happen in the future. Generative AI technology is not yet mature, so on the Galaxy S26 we do not apply generative AI at the photography stage.

Instead, AI is used in post-production, for example in the Gallery or when improving the quality of high-zoom photos. In the future, AI may partially replace camera hardware, and this is a common question for the entire industry. For now, however, we continue to develop hardware and AI in parallel. Major changes will be demonstrated through future products.

– Shooting concerts is a common need among Galaxy S Ultra users. Some companies have added a “stage mode”; what is Samsung’s plan?

– We believe that with the current system, Galaxy can already deliver very good concert photography. That said, it is true that many people want more specialized modes, and we are considering how to meet these needs in the future.

Currently, Samsung does not apply generative AI during the capture process. The reason is that this technology still has limitations, especially the “hallucination” phenomenon, which can create unwanted details. Instead, AI zoom is applied in post-production in the Gallery, where users can improve the quality of photos after taking them.

Galaxy S26 series devices. Image: Samsung

– Many companies are using 200-megapixel periscope cameras, some even with sensors manufactured by Samsung. Does Samsung plan to change the configuration of its flagship lines?

– In fact, the best image quality depends not only on the sensor but on the optimal combination of sensor and application processor (AP). Currently, the configuration on the Galaxy S26 is the optimal combination for users’ needs.

That said, we are still testing other configurations. When a new combination brings better value to users, we will consider adopting it.

By Editor