Thong Pham and Mark Keith, Information Systems
As the technologies enabling mobile and ubiquitous information rapidly evolve, so do the information privacy risks to consumers (Belanger and Crossler 2011; Pavlou 2011; Smith et al. 2011). Perceived privacy risk has been demonstrated to be a critical factor in both information disclosure intentions (Dinev and Hart 2006; Lowry et al. 2011; Xu et al. 2010) and behaviors (Keith et al. 2013; Lowry et al. 2011; Posey et al. 2010).
However, when forming information privacy risk perceptions concerning specific mobile applications or e-commerce channels, consumers do not have access to the relevant information necessary to make a rational decision (Keith et al. 2012). There is an asymmetry of information between the consumer and the provider—particularly if the provider has unethical intentions for the consumer’s data. Although a provider’s stated privacy assurances and third-party seals may provide consumers with an honest depiction of its intentions (Lowry et al. 2012; Reay et al. 2009), these policies are easily ignored. Perhaps more likely, consumers rely instead on heuristic signals of privacy risk such as brand recognition (Erdem et al. 2006) and social influences (Adomavicius et al. 2013). Thus, the purpose of this study is to measure the effect of different privacy cues on consumers’ privacy perceptions, in order to educate and prepare consumers for the information privacy risks presented by different mobile apps.
We sought to answer two research questions:
- How do various privacy signals (brand, social influence, institutional assurances) affect the perceived risk of, and actual disclosure of, personal information?
- How does perceived privacy risk change as various signals are introduced sequentially versus simultaneously (i.e., whether an “anchor” is introduced before the adjustment caused by subsequent signals)?
To examine the research questions above, we draw from theory on the Elaboration Likelihood Model (Lowry et al. 2012) as well as the Anchoring and Adjustment phenomenon (Tversky and Kahneman 1974) to explain the process by which mixed privacy signals are resolved and information disclosure decisions are made.
To test our model, we employed a randomized experiment involving mobile apps, which present many benefits and risks associated with information disclosure. With IRB approval, we recruited participants under the false pretense that a local market research consulting firm wanted their help evaluating one of several apps that various companies had hired the firm to “beta test” with consumers. Participants saw a list of the companies that had submitted apps for review and rated their perception of brand credibility for each company. The online tool we developed to administer the experiment and the mobile apps then randomly assigned each participant to review an app prototype featuring the brand logo of the company they had rated as having either the highest or lowest credibility. In addition, half of the participants were primed by reading a short paper by a privacy expert providing objective data on the risks of information disclosure.
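The assignment procedure above can be sketched in code. This is a minimal illustration, not the study’s actual instrument: the company names, the 1–7 rating scale, and the function and field names are all hypothetical assumptions.

```python
import random

def assign_conditions(credibility_ratings):
    """Assign one participant to an experimental cell.

    credibility_ratings: dict mapping company name -> this participant's
    brand-credibility rating (assumed 1-7 scale; illustrative only).
    """
    # Brand condition: show the logo of the company this participant
    # rated highest or lowest in credibility, chosen at random.
    brand_condition = random.choice(["high_credibility", "low_credibility"])
    if brand_condition == "high_credibility":
        brand = max(credibility_ratings, key=credibility_ratings.get)
    else:
        brand = min(credibility_ratings, key=credibility_ratings.get)

    # Anchor condition: half of the participants first read an expert
    # article on disclosure risks before reviewing the prototype.
    primed = random.choice([True, False])

    return {"brand_condition": brand_condition,
            "brand": brand,
            "primed": primed}

# Example: one participant's (hypothetical) ratings.
ratings = {"AcmeSoft": 6, "ZetaApps": 2, "NovaTech": 4}
cell = assign_conditions(ratings)
```

Because both factors are randomized independently, participants fall into a 2 (brand credibility: highest vs. lowest rated) × 2 (anchor: primed vs. not primed) design, letting each factor's effect on disclosure be estimated separately.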
We found that consumers’ perceived privacy risks can be manipulated not only by objective data (e.g., expert opinion and privacy policies) but also by heuristics such as social influence and brand recognition. Moreover, the order of the cues affects the amount of adjustment in consumers’ perceptions. Thus, we were able to find different levels of adjustment among different groups of participants.
The experiment drew on a population of more than 1,000 students, mostly from IS 110 taught by Dr. Keith. This population spans a range of age groups and backgrounds, although a student sample cannot fully represent the U.S. population. The amount of information disclosed in the experiment showed that (1) the benefit of using the app, (2) brand credibility, (3) brand recognition, and (4) privacy risk all had notable impacts. Specifically, the benefit of using the app had the greatest impact on information disclosure, which is understandable: people usually disclose information because it is required to use a mobile app at all, or to use it effectively. On the other hand, the coefficients estimated in SmartPLS showed that the more credible the brand, the less information was disclosed. This result is unexpected and requires further analysis to determine whether it reflects a flaw in the experiment. Nonetheless, the experiment succeeded in demonstrating the impact of these factors on information privacy perceptions.
It is important to understand that measuring information privacy perception is a complicated task. In fact, this experiment produced many data points that will take the researchers more time to thoroughly investigate and analyze. The next step in this research is to form a complete model of how the Elaboration Likelihood Model (Lowry et al. 2012) and the Anchoring and Adjustment phenomenon (Tversky and Kahneman 1974) explain the process by which mixed privacy signals are resolved and information disclosure decisions are made. We look forward to submitting our complete findings to one of the top Information Systems journals by June 2016.
- Adomavicius, G., Bockstedt, J.C., Curley, S.P., and Zhang, J. 2013. “Do Recommender Systems Manipulate Consumer Preferences? A Study of Anchoring Effects,” Information Systems Research (24:4), pp. 956-975.
- Belanger, F., and Crossler, R.E. 2011. “Privacy in the Digital Age: A Review of Information Privacy Research in Information Systems,” MIS Quarterly (35:4), pp. 1017-1041.
- Dinev, T., and Hart, P. 2006. “An Extended Privacy Calculus Model for E-Commerce Transactions,” Information Systems Research (17:1), pp. 61-80.
- Erdem, T., Swait, J., and Valenzuela, A. 2006. “Brands as Signals: A Cross-Country Validation Study,” Journal of Marketing (70:1), pp. 34-49.
- Kahneman, D., and Tversky, A. 1979. “Prospect Theory: An Analysis of Decision under Risk,” Econometrica (47:2), pp. 263-291.
- Keith, M.J., Babb, J.S., Furner, C.P., and Abdullat, A. 2010. “Privacy Assurance and Network Effects in the Adoption of Location-Based Services: An iPhone Experiment,” in: Proceedings of the International Conference on Information Systems (ICIS ’10), St. Louis, MO, p. 237.
- Keith, M.J., Babb, J.S., Furner, C.P., and Abdullat, A. 2011. “The Role of Mobile Self-Efficacy in the Adoption of Location-Based Applications: An iPhone Experiment,” Kauai, HI.
- Keith, M.J., Thompson, S.C., Hale, J., and Greer, C. 2012. “Examining the Rationality of Information Disclosure through Mobile Devices,” Orlando, FL.
- Keith, M.J., Thompson, S.C., Hale, J., Benjamin Lowry, P., and Greer, C. 2013. “Information Disclosure on Mobile Devices: Re-Examining Privacy Calculus with Actual User Behavior,” International Journal of Human-Computer Studies (71:12), pp. 1163-1173.
- Lassar, W., Mittal, B., and Sharma, A. 1995. “Measuring Customer-Based Brand Equity,” Journal of Consumer Marketing (12:4), pp. 11-19.
- Laufer, R.S., and Wolfe, M. 1977. “Privacy as a Concept and a Social Issue: A Multidimensional Developmental Theory,” Journal of Social Issues (33:3), pp. 22-42.
- Lowry, P.B., Moody, G., Vance, A., Jensen, M., Jenkins, J., and Wells, T. 2012. “Using an Elaboration Likelihood Approach to Better Understand the Persuasiveness of Website Privacy Assurance Cues for Online Consumers,” Journal of the American Society for Information Science and Technology (63:4), pp. 755-776.
- McKnight, D.H., Choudhury, V., and Kacmar, C. 2002. “Developing and Validating Trust Measures for E-Commerce: An Integrative Typology,” Information Systems Research (13:3), pp. 334-359.
- Pavlou, P.A. 2011. “State of the Information Privacy Literature: Where Are We Now and Where Should We Go?,” MIS Quarterly (35:4), pp. 977-988.
- Smith, H.J., Dinev, T., and Xu, H. 2011. “Information Privacy Research: An Interdisciplinary Review,” MIS Quarterly (35:4), pp. 989-1015.
- Tversky, A., and Kahneman, D. 1974. “Judgment under Uncertainty: Heuristics and Biases,” Science (185:4157), pp. 1124-1131.
- Vance, A., Elie-Dit-Cosaque, C., and Straub, D.W. 2008. “Examining Trust in Information Technology Artifacts: The Effects of System Quality and Culture,” Journal of Management Information Systems (24:4), pp. 73-100.
- Xu, H., Gupta, S., Rosson, M.B., and Carroll, J.M. 2012. “Measuring Mobile Users’ Concerns for Information Privacy,” Orlando, FL.
- Xu, H., Teo, H.H., Tan, B.C.Y., and Agarwal, R. 2010. “The Role of Push-Pull Technology in Privacy Calculus: The Case of Location-Based Services,” Journal of Management Information Systems (26:3), pp. 135-174.