
Real-time adaptable and coherent rendering for outdoor augmented reality

Abstract

This paper describes the development and evaluation of a color estimation method that creates more natural lighting conditions for outdoor augmented reality (AR). In outdoor AR systems, the real outdoor light source (i.e., the sun) illuminates real objects, while a virtual light source illuminates the augmented virtual objects. Because the two light sources differ in color, the real and virtual objects are rendered under visibly different illumination, and this visible color difference degrades the user's sense of immersion. To overcome this problem, we derive each RGB channel value of the virtual light source by analyzing the color produced by the outdoor light source, thereby reducing the visible color difference between the two light sources and, in turn, the visual incompatibility between the virtual object and the real background. In addition, by using virtual objects to express weather events in combination with the color estimation method, we demonstrate that the proposed method can adapt to the weather changes that affect outdoor AR. The proposed method has the potential to improve the visual coincidence between the real outdoor background and virtual objects.

1 Background

Recently, with the rapid development and diffusion of information technology and mobile devices within South Korea, approximately 80% of the population have come to own a personal mobile device equipped with a camera [1]. This rapid increase in ownership has created the opportunity for many people to use augmented reality (AR) technology without spatial constraints. For example, AR was popularized by the recent release of Pokemon Go; in addition, the Gartner Group has tracked AR trends for a number of years and used its findings to forecast the prospects of future AR technology [2].

However, despite the increased interest, AR still has many obstacles to overcome, particularly with respect to information design, interaction methods, and equipment and ergonomics problems [3].

Information design incorporates techniques that determine the rendering style of a virtual object based on the conditions of the real background viewed by the user. This synchronization is performed by altering the rendering style of the virtual object; making the virtual object respond to its surroundings in this way amplifies the user's feeling of immersion and strengthens the power of information transfer. Because of these benefits, synchronization research is necessary to ensure that users continue to use AR content, especially outdoor AR applications. Nevertheless, current AR research focuses more on content diversity and information delivery than on synchronization. For example, Pokemon Go does not incorporate synchronization technology; the rendering style of the augmented character is therefore unchanged even under significantly different background lighting conditions (Fig. 1a, b). As can be seen in the screenshot presented as Fig. 1b, the incompatible rendering of the augmented character can undermine the user's sense of immersion.

Fig. 1

Screenshots of Pokemon Go played at different times throughout the day: a 2 PM and b 8 PM

Thus, in this study, we investigated the implementation of synchronization technology in AR by developing a synchronization method that uses color and weather information to adjust the lighting of virtual objects, thereby solving the problem demonstrated in Fig. 1. We also developed and tested algorithms that can run in real time in order to evaluate their practicability in real-time outdoor game platforms. More specifically, we developed a synchronization method that reduces the heterogeneity between virtual objects and the real world in an AR scene through processes such as camera filtering and rendering style conversion.

1.1 Virtual object rendering by using direct lighting

Realizing realistic lighting of a virtual object placed in a real scene is a high-priority goal of AR systems. However, most existing approaches require the installation of additional equipment. Debevec proposed a method to calculate realistic lighting values based on the conditions of scenes and their surroundings as observed through a fish-eye lens mounted on a camera [4]. In AR, a fish-eye lens camera is commonly used to capture the light arriving from the surrounding environment [5]. However, because fish-eye lenses are not commercially available for mobile devices, this method is not currently applicable to them. Furthermore, although the method can estimate the light source position and the shadow of a virtual object, it does not consider the color attributes of the background lighting.

1.2 Realistic virtual object creation in a real outdoor environment

Kolivand and Sunar developed a technique for AR that models the interaction between sky colors and virtual objects, including their shadows. In addition to more realistic shadow projection, they focused on projecting realistic virtual objects in an outdoor AR environment [6]. Their aim is similar to that of the study presented in this paper, as they sought to increase the compatibility of virtual objects with the physical environment by determining the position of the sun from the latitude, date, and time. In their study, the position and quality of shadows were determined using a new algorithm, Z-GaF Shadow Maps, and the background illumination was set using the ambient light of the skybox.

1.3 Utilizing natural environmental lighting in AR

Rohmer et al. used a mobile device with a depth sensor to implement a white-balance-based color correction method. Their method was demonstrated to yield more photorealistic renderings by utilizing variants of differential light simulation techniques; additionally, it was tailored to mobile devices and operates at an interactive frame rate [7]. Although their use of white balance differs from the method described in this paper, their results suggest that real-time photorealistic rendering can be achieved by considering white balance.

2 Methods

2.1 Real-time color lighting representation technology for outdoor AR

2.1.1 Role of color lighting

The representative features of lighting include intensity, position, and color. Among these, color significantly influences human cognitive functions (accuracy, reactivity, attention, etc.) and the perception of elapsed time [8, 9]. Furthermore, color temperature and color rendering are known to be the main factors affecting the color judgment of the user. These factors can induce metamerism errors, in which the same color is perceived as different colors or different colors are perceived as the same color [10]. This phenomenon is easily observed in outdoor environments, as numerous factors affect the perceived color of sunlight. As the sun moves across the sky, the periods of sunrise and sunset tend to produce light perceived as orange, whereas midday light tends to be perceived as blue. Therefore, if the color emitted by the virtual light source does not change over time, the perceived heterogeneity between the actual background and the virtual object becomes increasingly apparent.

2.1.2 Color temperature definition

Color temperature refers to the temperature of a black-body radiator that emits light similar in color to that of a given light source. A black body is an object that completely absorbs all radiant energy incident on its surface; as it is heated, its color shifts from red through white to blue. Because the color of a real light source never exactly matches that of a black-body radiator, the nearest matching value, the correlated color temperature (CCT), is used instead. Color temperature is expressed in kelvin (K) [11]. In photography, color temperature is used to adjust the white balance, which corrects the color distortions that arise when taking pictures under natural or artificial lighting.

Figure 2 shows how outdoor photographs can be corrected using white balance; the overall lighting clearly differs according to the color temperature. Accordingly, the rendering style of an augmented virtual object can be corrected by applying the color temperature used in white balancing to the virtual lighting. Because this study focuses on creating more natural lighting, we applied the color temperature of the sun, the representative outdoor light source. During sunrise and sunset, the sun produces low-intensity lighting with a low color temperature of 2500 to 3500 K; in contrast, at noon, it produces high-intensity white lighting with a color temperature of 4000 to 7000 K [12]. Thus, the lowest temperature of 2500 K was used as the sunrise and sunset color temperature, and the maximum temperature of 7000 K was used as the noon color temperature.

Fig. 2

White balance according to color temperature

After establishing the color temperatures for these three events, the color temperature between sunrise/sunset and noon was varied linearly according to the altitude of the sun, as shown in Fig. 3; a code sketch of this interpolation follows the figure.

Fig. 3

Color temperature variation throughout a single day
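
As a concrete illustration, the following Python sketch performs this linear interpolation. It is a minimal sketch under stated assumptions: the paper does not reproduce the exact formula, so the function name, the clamping behavior, and the use of the day's peak altitude as the upper anchor are ours; the 2500 K and 7000 K endpoints come from Section 2.1.2.

```python
def cct_from_sun_altitude(altitude_deg: float, peak_altitude_deg: float) -> float:
    """Linearly interpolate the correlated color temperature (K) from sun altitude.

    Assumes 2500 K at the horizon (sunrise/sunset) and 7000 K at the
    day's peak altitude; the input is clamped so that altitudes below
    the horizon hold the sunrise/sunset temperature.
    """
    CCT_HORIZON = 2500.0  # K, sunrise/sunset endpoint (Section 2.1.2)
    CCT_NOON = 7000.0     # K, noon endpoint (Section 2.1.2)
    # Fraction of the way from the horizon to the day's peak, clamped to [0, 1].
    t = max(0.0, min(1.0, altitude_deg / peak_altitude_deg))
    return CCT_HORIZON + t * (CCT_NOON - CCT_HORIZON)
```

For example, with a peak altitude of 60°, an altitude of 30° yields 4750 K, halfway between the two endpoints.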

2.1.3 Color temperature algorithm

In the previous section, we described how the color temperatures of sunrise/sunset and noon were determined. In this section, we explain how the lighting color is derived from the color temperature. To determine the lighting color from the color temperature, the corresponding RGB values must be computed. In this study, the RGB values were computed using the temperature-to-RGB conversion algorithm designed by Helland [13]; the pseudo-code is provided in Table 1, and a runnable version is sketched after it.

Table 1 Algorithm pseudo-code [13]
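
Because Table 1 is not reproduced here, the following Python version is offered as a sketch of Helland's published approximation; the curve-fit constants are taken from his sample code [13], and the breakpoint at 66 (i.e., 6600 K) matches the discontinuities visible in Fig. 4.

```python
import math

def kelvin_to_rgb(kelvin: float) -> tuple:
    """Approximate the RGB color of a black body at `kelvin` K.

    Port of Tanner Helland's curve-fit approximation [13], valid
    roughly between 1000 K and 40000 K. Returns 8-bit (R, G, B).
    """
    t = kelvin / 100.0

    # Red: saturated at 255 up to 6600 K, then a power-law falloff.
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)

    # Green: logarithmic rise below 6600 K, power-law falloff above it.
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    # Blue: absent for very warm light, saturated at 255 from 6600 K up.
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda v: int(max(0.0, min(255.0, v)))
    return clamp(r), clamp(g), clamp(b)
```

For example, kelvin_to_rgb(2500.0) returns a warm orange of roughly (255, 159, 70), while kelvin_to_rgb(6600.0) returns pure white (255, 255, 255).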

Figure 4 shows each RGB channel value as a function of the color temperature. In each graph, the color temperature of 6600 K and the color value of 255 are marked; for all three channels, the relationship between channel value and color temperature changes sharply at 6600 K.

Fig. 4

Graph of RGB channel values according to color temperature: a R channel, b G channel, and c B channel

The pseudo-code in Table 1 was derived by fitting curves to these graphs. The approximation is close enough for real-time operation; however, it is not accurate enough for scientific use.

Figure 5 illustrates the application of the color temperature algorithm to a game engine. Orange light can be observed in Fig. 5a, d, as the time of day corresponds to sunrise and sunset, respectively; conversely, white light can be observed in Fig. 5b, c, corresponding to approximately noontime.

Fig. 5

Application of color temperature to game engine environment. a AM 08:00. b AM 11:00. c PM 03:00. d PM 07:00
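
To show how these pieces might fit together in a game engine, the sketch below drives the virtual sun from the current sun altitude using the two functions defined in the earlier sketches; the `set_light_color` callback is purely illustrative, standing in for whatever light-color API the target engine exposes.

```python
def update_virtual_sun(altitude_deg: float, peak_altitude_deg: float,
                       set_light_color) -> None:
    """Update the virtual light source once per frame (or per minute).

    Uses cct_from_sun_altitude() and kelvin_to_rgb() from the earlier
    sketches; `set_light_color` is a hypothetical engine callback that
    accepts normalized (r, g, b) components in [0, 1].
    """
    cct = cct_from_sun_altitude(altitude_deg, peak_altitude_deg)
    r, g, b = kelvin_to_rgb(cct)
    set_light_color(r / 255.0, g / 255.0, b / 255.0)
```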

2.2 Weather expression in outdoor-purposed AR

2.2.1 AR and weather

Abrupt weather changes are frequent, highly visible, and affect environmental objects such as trees. Such weather features interfere with the defining characteristic of AR, the dynamic overlay of virtual objects on the actual environment, and thereby hinder environmental awareness [14]. In this study, we attempted to solve this visualization interference problem by expressing the weather itself as virtual objects.

2.2.2 Parsing weather information from open API service

The Korea Meteorological Administration weather change information application programming interface (API) was used to obtain weather data from a domestic weather forecast site via URL-formatted requests. A reference time must be supplied as part of the request URL; an example of how it is calculated is presented in Fig. 6, which shows that the reference time (input time) rolls over at 40 min past each hour. For example, 09:00 is used for the period 09:41–10:39, which means the weather information can lag behind real time by 40 min to 1 h 39 min. A code sketch of this reference-time rule follows Fig. 6.

Fig. 6

Example of the time calculation method
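
The reference-time rule can be sketched in Python as follows. The rollover at 41 min past the hour follows the 09:41–10:39 example above; the exact boundary handling is our assumption, as the paper does not specify it.

```python
from datetime import datetime, timedelta

def api_base_time(now: datetime) -> datetime:
    """Return the reference (input) time for a weather API request.

    Per Fig. 6, data timestamped HH:00 is served from HH:41 until
    (HH+1):39, so the reference hour rolls over at 41 min past the hour.
    """
    if now.minute >= 41:
        base = now  # the current hour's data has already been published
    else:
        base = now - timedelta(hours=1)  # fall back to the previous hour
    return base.replace(minute=0, second=0, microsecond=0)
```

For example, api_base_time(datetime(2018, 1, 1, 10, 0)) returns 09:00 on the same day, matching the period described above.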

The reference time was supplied to the API in URL format, and the weather information, classified by specific tags, was returned in XML format; the desired attributes must then be extracted from the response by parsing. Additionally, because the AR weather effects implemented in this study were limited to rain, snow, and wind, only the precipitation type, wind direction, and wind velocity information were parsed, as shown in Table 2. A parsing sketch follows the table.

Table 2 Item names according to tags
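
A minimal parsing sketch is shown below. Since Table 2 is not reproduced here, the tag names PTY (precipitation type), VEC (wind direction), and WSD (wind speed) follow the category codes conventionally used by KMA forecast APIs, and both they and the element names (`item`, `category`, `fcstValue`) should be treated as assumptions about the response schema.

```python
import xml.etree.ElementTree as ET

# Assumed KMA-style category codes; see Table 2 for the actual tags.
WEATHER_TAGS = {
    "PTY": "precipitation_type",
    "VEC": "wind_direction",
    "WSD": "wind_speed",
}

def parse_weather(xml_text: str) -> dict:
    """Extract only the attributes needed for the AR weather effects.

    Assumes each forecast <item> carries a <category> code and a
    <fcstValue>; adjust the element names to the actual API schema.
    """
    weather = {}
    for item in ET.fromstring(xml_text).iter("item"):
        category = item.findtext("category")
        if category in WEATHER_TAGS:
            weather[WEATHER_TAGS[category]] = item.findtext("fcstValue")
    return weather
```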

3 Results and discussion

Figure 7 shows how snow and rain were visualized by the game engine. The proposed method offers flexibility by allowing the user to control the amount of snow and rain. However, because wind is not as directly visible as snow or rain, it was expressed indirectly through virtual objects, as shown in Fig. 8. Figure 9 presents an example of the effect of applying the color temperature in an AR environment; sunset was chosen as the representative time of day because the color change is most noticeable then. As can be seen, the blades of the virtual windmill in Fig. 9b have more of an orange hue than those in Fig. 9a. Furthermore, as can also be observed in Fig. 9b, applying the color temperature makes the virtual object much more difficult to distinguish as non-real, as its color is consistent with that of the real object, i.e., the house. Figure 10 shows that the virtual weather objects can be easily integrated into a real environment and offer the benefit of flexibility. Because weather events, unlike the background lighting, can be expressed dynamically in the AR environment, a realistic flow of time can be conveyed. This verifies the ability of the synchronization technology to effectively represent weather events.

Fig. 7

Examples of snow and rain implemented in the game engine: a minimum rain setting, b maximum rain setting, c minimum snow setting, and d maximum snow setting

Fig. 8

Virtual objects used to indirectly express wind effects: a windmill and b tree

Fig. 9

Example of color temperature application in an AR environment: a no effect and b effects of color temperature applications

Fig. 10

Example of weather expression in an AR environment: a no weather event, b rain, c snow, and d rain and snow

4 Conclusions

In this paper, we proposed a method to solve the visualization incompatibility problem in AR by altering the rendering style of virtual objects based on real outdoor lighting color data and real weather attributes. As previously mentioned, we focused on outdoor AR because changes in the real background are diverse and extensive. The lighting color and weather event settings could be expressed in real time, as demonstrated by implementing the method in a game engine. Moreover, since the proposed method was developed within a game engine, it can be readily integrated into other game engines. Furthermore, the results presented here suggest that the use of real data can reduce the visualization disparity between virtual objects and real background scenes. Further research is necessary to evaluate the adaptability of this technique and to confirm that the results can be replicated in an actual outdoor AR environment. In the future, we plan to improve the realism of the lighting color and to diversify the weather expression.

Abbreviations

API:

Application programming interface

AR:

Augmented reality

CCT:

Correlated color temperature

URL:

Uniform resource locator

XML:

Extensible Markup Language

References

  1. M Weiser, The computer for the 21st century. ACM SIGMOBILE Mob. Comput. Commun. Rev. 3(3), 3–11 (1999).

  2. J-I Hwang, Research trends in mobile augmented reality and prospects. Korea Inst. Inf. Technol. 11(2), 85–90 (2013).

  3. GJK Youngsun Kim, Interaction design and interface usability considerations for the proliferation of augmented reality based contents and services. Commun. Korean Inst. Inf. Sci. Eng. 29, 31–37 (2011).

  4. P Debevec, in Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques. Rendering synthetic objects into real scenes: bridging traditional and image-based graphics with global illumination and high dynamic range photography (ACM, New York, 1998), pp. 189–198.

  5. J-M Frahm, K Koeser, D Grest, R Koch, in Conference on Visual Media Production (CVMP), London. Markerless augmented reality with light source estimation for direct illumination (IET, Stevenage, 2005), pp. 211–220.

  6. H Kolivand, MS Sunar, Realistic real-time outdoor rendering in augmented reality. PLoS ONE 9(9), 108334 (2014).

  7. K Rohmer, J Jendersie, T Grosch, Natural environment illumination: coherent interactive augmented reality for mobile and non-mobile devices. IEEE Trans. Vis. Comput. Graph. 23(11), 2474–2484 (2017).

  8. W-S Chong, M Yu, T-K Kwon, N-G Kim, Study on the effect of cognitive function by color light stimulation. J. Korean Soc. Precis. Eng. 24(10), 131–136 (2007).

  9. H-J Suk, G-M Kim, The influence of chromaticity of LED lighting on time perception. Korean J. Sci. Emot. Sensibility 13, 69–78 (2010).

  10. SJ An, A study on the characteristics of chromatic cognition based on the ambient lighting. J. Soc. Korean Photogr. AURA 30(1), 61–72 (2013).

  11. A Choi, J Lee, B Park, Development and application of health lighting plan in residential areas. J. Archit. Inst. Korea 20(10), 287–294 (2004).

  12. Y-J Park, J-H Choi, M-G Jang, Optimization of light source combination through the illuminance and color temperature simulation of circadian lighting apparatus. J. Korea Contents Assoc. 9(8), 248–254 (2009).

  13. T Helland, How to convert temperature (K) to RGB: algorithm and sample code (2012). http://www.tannerhelland.com/4435/convert-temperature-rgb-algorithm-code/. Accessed 11 Oct 2017.

  14. P-H Wu, G-J Hwang, M-L Yang, C-H Chen, Impacts of integrating the repertory grid into an augmented reality-based learning design on students’ learning achievements, cognitive load and degree of satisfaction. Interact. Learn. Environ. 26(2), 221–234 (2018).


Acknowledgements

The authors thank the editor and anonymous reviewers for their helpful comments and valuable suggestions.

Funding

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean Government (Ministry of Science and ICT) (No. 2016R1D1A1B03935378 and No. 2017R1C1B5075856).

Availability of data and materials

The data used and/or analyzed during the current study are available from the authors on reasonable request.

Author information

Authors and Affiliations

Authors

Contributions

All authors took part in the discussion of the work described in this paper. SHS proposed the framework of this work and initiated its main algorithm. DWK and SOP carried out all of the experiments and drafted the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sangoh Park.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional information

Authors’ information

Sanghyun Seo received his B.S. degree in Computer Science and Engineering from Chung-Ang University, Seoul, Korea, in 1998, and his M.S. and Ph.D. degrees from the GSAIM Department at Chung-Ang University in 2000 and 2010. He was a senior researcher at G-Inno System from 2002 to 2005, a postdoctoral researcher at Chung-Ang University in 2010, and a postdoctoral researcher at the LIRIS Lab, Lyon 1 University, from February 2011 to February 2013. He worked at ETRI (Electronics and Telecommunications Research Institute), Daejeon, Korea, from May 2013 to February 2016. He is currently a faculty member of the Department of Media Software at Sungkyul University. He has been a reviewer for Multimedia Tools and Applications (MTAP), Computers & Graphics (Elsevier), the Journal of Supercomputing (JOS), and The Visual Computer (Springer); a program committee member for many international conferences and workshops; and a guest editor of several international journal special issues, including those of the Journal of Real-Time Image Processing, the Journal of Internet Technology, and Multimedia Tools and Applications. He has been an Associate Editor of the Journal of Real-Time Image Processing since 2017. His research interests are in the areas of computer graphics, non-photorealistic rendering and animation, 3D GIS systems, real-time rendering using GPUs, VR/AR, and game technology.

Dongwann Kang is an Assistant Professor in the Department of Computer Science and Engineering at Seoul National University of Science and Technology. He received his Ph.D. from Chung-Ang University in Korea in 2013, where he was a research fellow until Jun 2015. He was a lecturer of Undergraduate Interdisciplinary Program in Computational Sciences, Seoul National University, Korea (from Mar 2014 to Jun 2015); a lecturer at the Department of Multimedia, Sookmyung Women’s University, Korea (from Mar 2014 to Dec 2014); and a visiting researcher (from Jul 2015 to Jan 2018) and a Marie Sklodowska-Curie fellow (from Feb 2018 to Aug 2018) at the Faculty of Science and Technology, Bournemouth University, UK. His research interests include non-photorealistic rendering and animation, emotional computing, image manipulation, and GPU processing.

Sangoh Park received B.S., M.S., and Ph.D. degrees from the School of Computer Science and Engineering at Chung-Ang University in 2005, 2007, and 2010, respectively. He has been serving as an Assistant Professor in the School of Computer Science and Engineering at Chung-Ang University since 2017. He served as a Senior Researcher at the Global Science experimental Data hub Center at the Korea Institute of Science and Technology Information from 2012 to 2017 and as a Research Professor in the School of Computer Science and Engineering. His research interests include high-performance computing, big data systems, tape storage systems, embedded systems, cyber-physical systems, home networks, smart factories, and Linux systems.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Seo, S., Kang, D. & Park, S. Real-time adaptable and coherent rendering for outdoor augmented reality. J Image Video Proc. 2018, 118 (2018). https://doi.org/10.1186/s13640-018-0357-8

