US20240268541
2024-08-15
Human necessities
A45D44/005
A method has been developed for creating photorealistic renderings of cosmetic products. This process involves the use of advanced imaging techniques to apply cosmetic effects realistically to different individuals based on reference images.
The first step in the method is to obtain a reference image (Xref) showing a real cosmetic product (PC) applied to a first individual (P1). Additionally, at least one source image (Xjsource) of a second individual (P2) is required. These images serve as the basis for generating the final rendering.
An encoding artificial neural network (E) is then employed to analyze the reference image (Xref). This network determines key characterizing parameters (E(Xref)) of the cosmetic product, which are essential for accurately replicating its appearance on different skin tones and textures.
Following the parameter extraction, a realistic physically based rendering engine (R) is applied. This engine combines the characterizing parameters obtained from the neural network with the source image (Xjsource) to produce a transformed image (R(Xjsource, E(Xref))). The result is a photorealistic rendering of the cosmetic product applied to the second individual (P2).
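The two-stage pipeline described above (encode the reference, then render onto the source) can be sketched as follows. This is a minimal illustrative stand-in, not the claimed implementation: the real encoder E is a trained neural network and R a physically based renderer, whereas here E is approximated by a mean-colour extraction and R by a simple alpha blend; all function names and the `alpha` parameter are hypothetical.

```python
import numpy as np

def encode_reference(x_ref):
    """Stand-in for the encoder E: reduce the reference image Xref
    to a small parameter vector E(Xref) characterizing the cosmetic
    product (here, simply the mean colour of the image)."""
    return x_ref.reshape(-1, 3).mean(axis=0)

def render(x_source, params, alpha=0.5):
    """Stand-in for the rendering engine R: combine the source image
    with the product parameters (here, a per-pixel alpha blend)."""
    return (1.0 - alpha) * x_source + alpha * params

# Reference image Xref: individual P1 wearing the product PC.
x_ref = np.full((4, 4, 3), [0.8, 0.2, 0.3])
# Source image Xjsource: individual P2, no product applied.
x_source = np.full((4, 4, 3), [0.6, 0.5, 0.4])

params = encode_reference(x_ref)        # E(Xref)
out = render(x_source, params)          # R(Xjsource, E(Xref))
```

In the patented method the encoder and renderer are far richer, but the data flow is the same: the parameters extracted from one person's image are reusable across any number of source images of another person.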
This method has significant implications for the cosmetics industry, enabling brands to offer virtual try-on experiences. By allowing potential customers to see how products would look on them without physical application, it enhances customer engagement and satisfaction while streamlining the purchasing process.