Customer relationship management (CRM) deals with the processes and systems that support business strategies to build long-term and profitable relations with customers (Ngai et al. 2009). The rapid development of the digital world has changed marketing models by transforming CRM practices and the relationships between customers and companies. Easier access to customers’ online data (through social networks, search-engine history, or the creation of cookies and other tracking systems) allows companies to gather a huge variety of information about customers. Such access also allows companies to create cloud systems which gather and compile the data, and which strategize and automate CRM practices.
How to Cite
Macnish, K., & Ana, F. I. (2019). Customer Relation Management, Smart Information Systems and Ethics. ORBIT Journal, 2(2). https://doi.org/10.29297/orbit.v2i2.114
Advances in Artificial Intelligence (AI) and Machine Learning (ML) have also played a significant role in business development since the turn of the 21st century, with many of these developments being incorporated into new CRM practices. Braun and Garriga (2018) argue that the ML approach to Big Data and CRM goes beyond mere statistics or the classification of data, towards systems which learn from the data themselves. This, they claim, results in the optimization and automation of processes involved in purchasing goods and services. They state that, “with advanced data engineering, we blend data from all collected data sources…and build ML models that will accurately predict the propensity of a customer to churn [cease to be a customer of the company] based on the stored digital traces… Our models predict the propensity to buy a new product” (Braun and Garriga 2018, pp. 667-68). Thus, Big Data is having a significant impact on the ways in which customers are attracted to, and retained by, an organisation. It is therefore important to study the influence of these developments on customers, and their ethical implications. This case study focuses on determining which ethical issues arise in the use of Smart Information Systems (SIS) in CRM and how they can best be addressed. This is carried out through a review of the key issues in the literature and through an interview with an employee of the Finnish telecommunications company Company A. Company A currently uses CRM to maintain a strong customer base by engaging with customers throughout their contract, and especially when the contract is due to expire. At this key time for the company, customers may choose a contract with a different company (“churn”). CRM is also helpful in engaging with ongoing customer needs, such as maintenance and trouble-shooting.
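The churn-propensity prediction Braun and Garriga describe can be illustrated with a minimal sketch. The feature names, weights, and scores below are invented for illustration only; a real model would learn its parameters from historical "digital traces" rather than hard-coding them.

```python
import math

# Hypothetical feature weights; a production model would learn these from
# historical customer data (usage, billing, support contacts, etc.).
WEIGHTS = {"days_to_contract_end": -0.02, "support_tickets_90d": 0.35, "monthly_spend": -0.01}
BIAS = -0.5

def churn_propensity(customer: dict) -> float:
    """Return a 0-1 score estimating how likely a customer is to churn."""
    z = BIAS + sum(WEIGHTS[k] * customer.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link squashes to (0, 1)

# A customer near contract expiry with many recent support contacts scores
# higher than a mid-contract customer with no issues.
at_risk = churn_propensity({"days_to_contract_end": 10, "support_tickets_90d": 4, "monthly_spend": 20})
loyal = churn_propensity({"days_to_contract_end": 300, "support_tickets_90d": 0, "monthly_spend": 60})
print(at_risk > loyal)  # True
```

Customers whose score exceeds some threshold would then be targeted with retention offers before their contract expires.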
The aim of the case study is to identify what ethical issues arise from the use of SIS in CRM, whether companies that use SIS for CRM have policies and procedures in place for addressing these concerns, and whether practitioners face additional issues not addressed in the current literature. The case study is divided into four main sections. Sections 1 and 2 comprise the literature review: Section 1 reviews the technical aspects of CRM, while Section 2 reviews the ethical issues within CRM. Section 3 focuses on the interview with the employee of Company A. Finally, Section 4 critically evaluates the ethical issues that have arisen in Company A’s use of SIS technologies in CRM.
The Use of SIS Technology in Customer Relationship Management
This section details SIS technology and companies which are developing CRM practices to elucidate how this technology is used in practice. A literature review is conducted on the different ways that CRM is used and implemented.
Hailwood and Gottlieb present a background overview of CRM, and explain that its goals are “acquiring, developing and retaining satisfied, loyal customers” (Hailwood and Gottlieb 2003). They argue that increasing the number of profitable customers is desirable for the overall profit and development of a company. It is thus expected that companies which implement CRM successfully will obtain the loyalty of customers. Chen and Popovich (2003) argue that a large part of contemporary CRM requires the implementation of technology. In a similar vein, Winer has charted how the technological revolution has changed the relationships companies have with clients. The most significant aspect of this is increased interaction and an improved, customized experience for the customer, such that companies now have a better ability to “establish, nurture, and sustain long-term customer relationships than ever before” (Winer 2001, p. 89; see also Blázquez, 2014; Parise et al., 2016; Soudagar et al., 2012). More recently, CRM software has been recognized as comprising the largest software market in the world, being used by 91% of companies with 12 or more employees, and is estimated to reach a value in excess of $80bn by 2025 (Taylor, 2019).
Winer argues that the creation of a customer database is a necessary first step towards establishing effective customer-relationship management (Winer 2001, p. 91). Thus, a feature of contemporary market practice has been that companies acquire large datasets regarding stock and customers, enabling more effective mass-marketing practices. There are therefore a number of different CRM strategies in which the analysis of data is used. Ngai et al. (2009) identify four major CRM dimensions (ibid., p. 2593):
- Customer identification is the targeting of users who are most likely to become customers.
- Customer attraction involves direct marketing, such as sending emails or coupons.
- Customer retention incorporates understanding user satisfaction, including practices such as customer profiling and one-to-one marketing.
- Customer development includes historical analysis of data with the purpose of predicting future customer desires and behaviours (ibid., p. 2595).
With the increasing technical and financial ability to manipulate and interpret large datasets, these CRM strategies have employed data analytics, in which the analysis of customer data and behaviours allows the discovery of new and valuable information about customers (ibid). Braun and Garriga (2018) affirm this and note that the development of CRM is reflected in questions about consumers which are based on hypotheses derived from data analysis, such as:
- “Consumer profile—by which category can a distinct consumer be described?
- Interest—what is the specific consumer interest?
- Next best product—what is she most likely to buy next?
- Cross- and upsell—what other product or services can be offered?” (Braun and Garriga 2018, p. 665).
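The “next best product” question above can be illustrated with a simple co-occurrence heuristic: recommend the item most often bought alongside products the customer already owns. The product names and purchase histories below are invented; production recommenders use far richer models.

```python
from collections import Counter

# Toy purchase histories; a real CRM system would mine millions of records.
histories = [
    ["phone", "case", "charger"],
    ["phone", "case"],
    ["phone", "case", "screen_protector"],
    ["tablet", "charger"],
]

def next_best_product(owned):
    """Recommend the item most often co-purchased with products the customer owns."""
    scores = Counter()
    for basket in histories:
        if owned & set(basket):  # basket from a customer with overlapping purchases
            for item in basket:
                if item not in owned:
                    scores[item] += 1
    return scores.most_common(1)[0][0] if scores else None

print(next_best_product({"phone"}))  # case
```

A phone owner is recommended a case here simply because cases co-occur with phones most often in the toy data.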
CRM has therefore become an online behavioural targeting (OBT) practice which comprises the monitoring and tracking of customers’ online behaviour, as well as the use of the collected data to individually target and understand specific customer segments (Boerman, Kruikemeier, & Zuiderveen Borgesius, 2017). This has been particularly significant in areas where customer data was collected prior to developments in computing ability. Banks and telecommunications companies, for example, have routinely gathered large amounts of customer information since their inception. Writing in 2001, Winer offered several examples of different companies’ database practices that were already well established at the beginning of the 21st century.
- One multinational company had created a global database from its worldwide operations, storing records of e-mails, contacts, and locations.
- A golfing equipment company gathered more than 1.5 million golfers’ names, together with addresses and even vacations and birthdays.
- A bookseller, prior to its collapse in 2011, collected all of its customer information into a single database and sent emails tailored to each customer’s reading interests.
- A British tour operator gathered information about individual customers’ expenditure, which enabled it to discover which customers were most profitable (Winer 2001, pp. 92-97).
Ethical Issues of Using SIS in CRM
Data analytics is a growing business that has changed the traditional market into a digital one, involving the storage and analysis of large quantities of data to offer predictions based on users’ behaviour (Braun 2016). Developments in SIS technology pose new challenges and questions concerning the continuous collection and analysis of data and the impact this can have on consumers. SIS can be seen as a powerful tool that gives new insights and addresses issues such as terrorism or health and disease (Moorthy et al. 2015, p. 75), as well as providing more personalised services (Macnish 2018, pp. 114-17). However, Big Data also risks decreasing user privacy and increasing the potential for companies to control and manipulate customers.
The possibilities for CRM that come with Big Data analysis raise important issues such as:
- ownership of data;
- how data can be shared and remain protected;
- how companies can demonstrate their use of data to their customers; and
- how customers can take greater control of their data (Braun and Garriga 2018, pp. 671-72).
In 2018, the UK Institute of Business Ethics published an article discussing fundamental values and principles for the use of AI in business: accuracy, respect for privacy, transparency and openness, interpretability, fairness, integrity, control, and impact (IBE 2018). A literature review demonstrates that a number of ethical issues are being discussed within the academic community. These include autonomy, control, manipulation, privacy, users’ knowledge, responsibility, trust, and bias (see below). This section therefore presents a literature review of the ethical and societal issues arising from the use of SIS technology in CRM practices. The literature review was carried out through a combination of online searches using generic engines such as Google and Google Scholar, and discipline-specific search engines on websites such as PhilPapers.org and the Philosophers’ Index. The bibliographic references of selected papers were then used to locate further literature. Generic searches on Google also provided links to trade publications and websites that were a further source of background information.
Autonomy, Control and Manipulation
There is a concern among some authors that the autonomy of customers is being undermined through the employment of data analytics. Weston questions whether data analysis techniques are able to measure virtue, cautioning that such analytics are not a neutral tool or measurement: they can expand, constrain or alter people’s choices and behaviours, each of which has an influence over the user. He suggests that data analytics treats individuals as already having made, or being inevitably about to make, certain choices, often with a moral component (Weston 2016, pp. 38-40), but challenges this by suggesting that virtue is “too broad and flexible to be measured” (ibid., p. 33). Weston demonstrates that predictions can be a way of giving strong recommendations towards choices and desires, as well as a means to follow and pressure customers around the Internet, leaving them with less autonomy to decide for themselves. Weston’s arguments suggest that there are two sides to data predictions: the accurate prediction of a desire may at one and the same time involve the constraint of choice (ibid., pp. 37-38). Boyd and Crawford (2012) argue that design decisions determine what will be measured for analysis. This implies that the designers of analytics systems are making decisions, whether they realise it or not, about which attributes and variables will be counted and which will be ignored. As a general reflection on the design of data-analytic systems, this has an impact on the design of such systems used for CRM. Harrison and Grey (2010) have exposed a correlation between consumer debt and financial institutions’ access to information. The authors show how a rise in consumer debt occurred at the same time as the development of complex marketing-profile methods that included neural networks and predictive models to target consumers (Harrison and Grey 2010, p. 437).
They explain how models that predict a consumer’s likelihood of bankruptcy are used to evaluate the profitability of a client, rather than to prevent consumers from making harmful financial decisions. For example, new SIS practices can target people who are financially vulnerable and therefore more susceptible to taking out a loan (ibid., p. 438). Based on these insights, the authors argue that current consumer policies are insufficient to protect susceptible consumers from new direct-marketing techniques that use customers’ information to exploit and manipulate them (ibid). Hence SIS technology generally, and by extension that used in CRM practices, has the potential to restrict choice and modify the behaviour of customers. This amounts to the possibility of restricting autonomy and controlling some consumer choices.
Privacy and Knowledge
Internet access using platforms such as Google, Facebook, or Twitter allows for personal information to be disclosed to the platform and other websites, even if customers do not want to reveal this information. Privacy issues are therefore a significant concern for CRM practices: “people unknowingly and unintentionally are communicating personal data to someone else. For marketers Big Data is a powerful weapon for capturing consumer data directly, indirectly, unobtrusively, with and without permission and participation” (Moorthy et al. 2015, pp. 92-93). Moorthy et al. mention the well-known example of Target, a company which developed an algorithm to identify pregnant women based on users’ previous activities online (ibid., p. 92; see also Duhigg, 2012), and the use of facial recognition in social networks to identify people on dating sites (ibid., p. 93). Similarly, Braun and Garriga (2018) demonstrate the challenges of using data analytics and explain how anonymous data can lead to uncovering individuals’ identities (Braun and Garriga 2018, p. 666). Data anonymity is no longer sufficient because, by enriching previously anonymised data with external or historical data, personal information may be reconstructed: for example, a person’s hourly location can be recovered by analysing different entry points (De Montjoye et al. 2013). Ghosh and Moorthy (2015) also argue that the automation of marketing practices reduces transparency, such that the opportunity to expose sufficient knowledge to customers for informed critique is reduced. For example, the cookie consent procedure introduced by the European Parliament (EU Parliament, 2002) is not enough for a new user to understand the complex mechanisms of data gathering and what a company may be using those data for (ibid., p. 93). Data can be stored for longer periods of time and might be put to different purposes.
In addition, the authors emphasize that there are important risks of security breaches, such as credit card fraud, given that the data are generated from multiple different points, rendering them difficult to secure (ibid). Similarly, Braun and Garriga (2018) affirm that the nature of Big Data is to store data for later use, often without knowing what that use will be, which can leave consumers uninformed about data usage (Braun and Garriga 2018, p. 671).
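The re-identification risk described above can be sketched as a simple linkage attack: “anonymised” records still carry quasi-identifiers (here, a coarse location cell and an hour), which an auxiliary dataset can re-link to names. All records, names, and field layouts below are invented for illustration.

```python
# "Anonymised" location traces: the name is gone but quasi-identifiers remain.
anonymised = [
    {"user": "u1", "hour": 8, "cell": "A"},
    {"user": "u1", "hour": 18, "cell": "C"},
    {"user": "u2", "hour": 8, "cell": "B"},
]
# Auxiliary data, e.g. public time-stamped check-ins or posts.
auxiliary = [
    {"name": "Alice", "hour": 8, "cell": "A"},
    {"name": "Alice", "hour": 18, "cell": "C"},
]

def reidentify(anon, aux):
    """Map names to pseudonymous user IDs by joining on (hour, cell)."""
    matches = {}
    for rec in aux:
        for a in anon:
            if (a["hour"], a["cell"]) == (rec["hour"], rec["cell"]):
                matches.setdefault(rec["name"], set()).add(a["user"])
    # Only resolve names whose matching points all lead to a single pseudonym.
    return {name: users.pop() for name, users in matches.items() if len(users) == 1}

print(reidentify(anonymised, auxiliary))  # {'Alice': 'u1'}
```

With just two known space-time points, the pseudonym "u1" is uniquely resolved, echoing De Montjoye et al.'s finding that a handful of points suffices to single out most individuals.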
A further challenge to the collection and use of data for CRM and elsewhere is the possible lack of informed consent given by the consumer for the data to be used (Foster and Young, 2011). Data may have been given initially for a particular reason, and yet, as noted above, this reason may be replaced by other interests at a later date. In such cases the company may hold the data but not have received informed consent for their use to the latter end. In the case of CRM, data is collected by the company as customers subscribe to services, but customers may not always consent to having their data analysed for insights or profiling. Even when consent is given, as for instance in the case of store loyalty cards, the apparent means of gaining consent may consist of merely ticking a box to say that one agrees to the terms and conditions. However, it is widely accepted that very few people ever read these terms and conditions, and as such the validity of that consent is questionable.
The problem of consumers not reading the terms and conditions to which they give consent is heightened when companies feel the need to couch their terms and conditions in legalistic jargon to avoid class-action lawsuits. In such cases, even a customer who attempts to engage with the terms and conditions may find that they are unable to understand them.
While the centrality of informed consent is well established in some fields such as medicine and academic research, it is less so in marketing. Prior to the introduction of the General Data Protection Regulation (EU Parliament, 2016) and in areas outside the jurisdiction of the EU, a response sometimes raised is that there is no harm involved in the collection and use of such data for purely marketing purposes and so consent is less important than in the aforementioned fields (Hill, 2014). Yet the reason for gaining informed consent from customers, or indeed anyone, is both to limit harm and to respect the autonomy of the individual to participate (Beauchamp, 2009; Manson and O’Neill, 2007).
Bias and Discrimination
Concerns about bias entering into AI and ML have been growing since at least 2010, and have been well documented by Cathy O’Neil, amongst others (Mittelstadt et al., 2016; O’Neil, 2016). Despite surface assumptions that computers are unbiased because they do not recognize, for example, skin colour or gender, increasing research has evidenced the potential for the outputs of automated systems to be prejudiced against certain groups in society (Macnish, 2012). This may come from ignoring culturally-specific practices such as clothing choices or walking behaviour, or from drawing understandings of the norm (from which any deviance is recognized by the automated system) from dominant cultures. Within CRM this might lead to certain groups in society being either ignored, or targeted for particular attention, for no reason other than that the determining program was biased. Ultimately this can lead to legal issues if some groups receive too much or too little attention and respond negatively towards the company because they perceive this to be discriminatory. An example would be a high-end store which uses a loyalty card to maintain contact with regular customers. Given the nature of the store, regular customers are likely to be wealthier than average. To maintain good customer relations, customers with loyalty cards are offered benefits to “thank” them for their loyalty. These might come in the form of free drinks at the in-store bar, special viewings of sale items, or exclusive savings. In this way, regular (wealthier) customers end up paying less for items in the store than irregular (poorer) customers, potentially exacerbating wealth disparities in society. An alternative example is that of searching for hotels on an Apple computer or a Windows computer.
According to a Wall Street Journal article in 2012, the hotel search engine Orbitz would steer Apple users towards pricier hotels on the assumption that a person who used an Apple was wealthier than one using Windows (Mattioli, 2012). In this case, price discrimination is taking place with a lack of transparency in such a way that people are benefitting from (or being penalised for) their perceived wealth.
Responsibility and Trust
The unequal power relationship between companies and consumers creates new concerns about accountability and data ownership (Braun and Garriga 2018). Responsibility for data is thus a key concern; for example, credit-risk assessment based on social network profiles has been shown to be inaccurate (ibid., p. 667). Data protection and company responsibility are important for any organization that runs a business built on trust, in which frameworks such as privacy by design are necessary. The following Privacy-by-Design recommendations have been made for business (Braun and Garriga 2018, pp. 672-73):
- Data sharing only with the authorization (consent) of the owner of the data, for a specific time frame and for a specific purpose;
- Data limited to the purpose of the request;
- Data not used against customers: data should be used only for the purpose established between the owner and the business when data services are agreed;
- Confidentiality, with data not shared with third parties;
- Necessity and proportionality, whereby a business should only store and process data that is necessary for its services and products.
Case Study: a Telecommunications Company using CRM SIS
It is important to evaluate whether the ethical issues raised in the literature correspond to those addressed by practitioners. In order to do this, this section focuses on Company A, a telecommunications company in Scandinavia, and its use of CRM SIS in practice. Company A uses SIS technology in online marketing practices to improve productivity and reduce churn. It is a telecommunications, ICT and digital services company operating mainly in Scandinavia and Estonia that provides environmentally sustainable services for communication and entertainment, and the tools for organizations to digitalize their operations and improve productivity. Background research about Company A’s use of SIS technology was conducted, and a Company A staff member was interviewed. The staff member is responsible for data analysis as part of the group which handles data security and privacy issues. The interview was conducted in October 2018, at Company A’s headquarters in Scandinavia. During the interview, the uses of SIS were discussed, and the most fundamental issues concerning this technology were reviewed. The qualitative analysis software tool NVivo was used to categorise, define, and evaluate the content of the interview: the interview was segmented and categorised into thematic nodes, which were then analysed to produce this report.
Description of SIS technologies being used in Company A
Company A serves more than 2.8 million customers, who account for over 6.2 million subscriptions. The company has become a market leader and presents itself as a pioneer in new network technologies and innovations, such as 5G. Company A cooperates with Vodafone and Telenor, among others, to enable global services. The company is listed on the Nasdaq Helsinki Large Cap with approximately 190,000 shareholders. In 2017, revenue was 1.79 billion euros, and the company employed 4,700 people in 13 countries. The group dealing with data analytics has three regional areas of competence: cybersecurity development, video conferencing, and contact centres. In addition, Company A also works with different operational models and subsidiaries. Company A has recently introduced a system designed to monitor and manage assets (customers) and infrastructures; the data it retrieves is used to study performance and trends. The principles included in this system are real-time advice and interfacing, Big Data management, IoT device management and control, process and workflow automation, abstracted visualization, and machine learning. The interviewee explained that the application of SIS is simple: it mainly refers to rule-based systems that enable the handling of customer data to improve systems and services. The purpose of the technology is network optimisation and the unification of customer identification across platforms. Company A merges a user’s identities from different databases into a common master digital identity (ID), which helps to identify a person in as many ways and places as possible. This is made possible through machine learning (ML): Company A employs data scientists to create matching algorithms, as well as an ML-based automatic network-optimization product which helps, for example, to reduce outbound phone calls by predicting how likely a customer is to answer when Company A calls.
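The unification of customer identities into a master ID can be sketched as a record-linkage problem: records from different databases that share an identifier such as an email address or phone number are grouped under one master ID. The record layout below is a hypothetical simplification; the actual matching algorithms Company A's data scientists use are not described in the interview.

```python
# Invented records from three hypothetical internal databases.
records = [
    {"source": "billing", "id": "b-17", "email": "jo@example.com", "phone": None},
    {"source": "support", "id": "s-03", "email": "jo@example.com", "phone": "040-111"},
    {"source": "web",     "id": "w-99", "email": None,             "phone": "040-111"},
    {"source": "web",     "id": "w-42", "email": "mia@example.com", "phone": None},
]

def unify(records):
    """Union-find over records that share an email or phone number."""
    parent = {r["id"]: r["id"] for r in records}

    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    seen = {}  # (field, value) -> first record id that carried it
    for r in records:
        for key in ("email", "phone"):
            if r[key] is not None:
                if (key, r[key]) in seen:
                    union(r["id"], seen[(key, r[key])])
                else:
                    seen[(key, r[key])] = r["id"]
    return {r["id"]: find(r["id"]) for r in records}

master = unify(records)
print(master["b-17"] == master["s-03"] == master["w-99"])  # True: one person
print(master["w-42"] == master["b-17"])  # False: a different person
```

The billing, support, and web records chain together (shared email, then shared phone) into one master identity, while the unrelated web account stays separate.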
Thus, while accuracy is important, for Company A better processes and more suitable solutions for customers are needed. Company A is also working with AI, where a number of different experiments have been tried, such as reducing customer-care contact. Currently Company A is very good at estimating whether a customer will make contact again: in the case of billing, for example, the company has found that a typical customer will generally make contact after seven days. Moreover, Company A uses large datasets to determine the profitability of a subscription package when matched to a particular customer. This then enables the company to avoid being “gamed” by customers. It has also implemented a chatbot, a text-based machine operator through which customers communicate with a machine instead of talking with a human. The chatbot is an option for customers who go online to engage with the company, used to filter people whose problems can be resolved relatively easily from those whose difficulties require a human. When it is used, the chatbot is always clearly identified as a robot, so the company does not deceive customers into believing that they are speaking to a person when they are not. Furthermore, customers are given the choice of whether they would prefer to speak to a person or a chatbot, so consent and personal choice are respected. The benefits of using chatbots in this way are that they free up customer support to focus on more challenging issues while the automated system resolves relatively straightforward ones. The algorithms used by Company A allow pricing decisions to be made based on modelling, which creates assumptions about customers. The interviewee explained that customers are treated differently if, for example, a customer has a higher mobile score. However, the interviewee also held that Company A does not engage in price discrimination.
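The chatbot's triage role described above can be sketched as a simple rule-based router. The topic list is invented for illustration, and a real system would classify free text rather than match it exactly.

```python
# Hypothetical topics a chatbot could resolve without human help.
EASY_TOPICS = {"invoice copy", "change address", "data usage"}

def route(request: str) -> str:
    """Send easy requests to the chatbot and everything else to a person."""
    topic = request.lower().strip()
    return "chatbot" if topic in EASY_TOPICS else "human agent"

print(route("Invoice copy"))       # chatbot
print(route("my router is broken"))  # human agent
```

The design point is the filtering itself: straightforward requests never reach a human, freeing support staff for the harder cases.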
In the future, the interviewee noted, modelling scores could become a tool to find the optimal price to keep a customer subscribing. Company A operates in an industry where the tools of the future are built through continuous development, innovation and cooperation between stakeholders in different fields. For this reason, the company is also closely involved in research projects and startup activities in the industry. The interviewee explained that the company is looking at future projects such as voice recognition to identify customers and save time in gathering customer data during a phone call. While it is relatively easy to recognise a person online, it is not as simple within a 30-second call. Hence Company A is looking into the storage of voice samples from customers, from which a customer could be recognized instantly while talking with an agent. In addition, the interviewee considered the benefits of the possible use of cameras on set-top boxes (STBs) to ensure that a person watching TV is not underage.
Table 1: Case study data
The Effectiveness of Using SIS for Company A
The effectiveness of SIS for Company A relies on its understanding of the reasons for its “churn score”: the number of customers ending their subscriptions. The aim of Company A’s model is to further increase customer orientation and cost-efficiency. For example, the interviewee explained how Company A asks customers for consent to combine their browsing data with CRM data, which enables Company A to identify a problem in advance from browsing behaviour: for instance, if a customer has added something to a basket but not finished the purchase. Similarly, if a customer bought a mobile router from Company A but is browsing that router online, it is probably because they do not know how to use it. SIS has also been useful in saving time for customers, keeping customers interested in subscribing to Company A, and even finding criminal behaviour. It is therefore evident that the use of SIS brings numerous business and customer benefits. However, the company has also faced problems such as the unification of identities: the interviewee stated that there are 11 million different customer accounts. The interviewee saw it as imperative that the company improve customer identification across different systems such that each individual entity (person or company) has a unique identity.
Ethical Analysis of CRM Using SIS
The interview conducted at Company A and the background research highlight a number of ethical issues arising from the use of SIS technology in CRM practices. These issues widely reflect those found within the literature discussed in section 2 of this paper. There is therefore a great deal of correlation between the academic understanding of the issues and that of those working in the industry. There is also a relation between the issues faced with CRM technology and the difficulty of dealing with them. For example, the power and potential of gathering data affect the consumer in ways that might not have been predicted. This is a concern which the industry, including Company A, agrees upon but for which it can find no easy solution. How much data it is necessary to store is a challenging and complex question to answer effectively. Thus, there is a need to start a dialogue between the ethical and technical fields.
Intrusion and Regulatory Differences
Company A is able to ask customers if it can combine their browsing data with CRM data, which enables the identification of a problem from their browsing behaviour. This, as the interviewee explained, can be very intrusive. However, Company A seeks consent from customers and the terms and conditions are visible on the website. The interviewee pointed out that tracked browsing behaviour relates only to Company A’s own website, and not to data gathered from other websites or third parties like Google. However: “The question still remains as to what kind of benefits the user gets out of it” (The interviewee). The interviewee also argued that Company A does not need to store excess information beyond that which is already being used: “Thinking from the customer’s perspective: if something doesn’t benefit the customer, then there’s no point doing it either” (The interviewee). Company A uses data for modelling in order to find the products that a customer is most likely to buy. In this way, the interviewee argued, the service level becomes much better. The interviewee also discussed the possibility of storing data even if it is not currently useful, simply because in the future it might be. In China, the interviewee explained, the usefulness of data reaches a different level. Chinese data collection tends to be more intrusive than European, so there is a need to assess the cultural and legal parameters of different markets. The interviewee mentioned the difference between the impact of GDPR in Europe and the situation in other countries, where GDPR does not apply. Depending on the country and the culture, laws differ in ways which influence competitive advantage. For example, it would be comparatively easy for a company based and operating outside of the EU jurisdiction to develop a dataset that would be unethical to produce in Europe. On the basis of this dataset, the company could then develop a highly effective SIS which could then be sold in Europe.
This would be economically advantageous to that non-EU company, provided no personal data collected in this manner ever entered the EU. The interviewee felt that, particularly in the light of GDPR, companies in Europe both have to and should protect the privacy of European citizens, but that it is incumbent on the EU Parliament to make sure that those companies can continue to compete in the international market, especially with parts of the world which espouse different ethical values. Therefore, questions need to be raised about the possibility of the EU placing restrictions on the sale of SIS products which cannot provide evidence of ethical development.
Privacy and Security
As noted above, privacy issues are a concern, as is people's lack of awareness that certain devices, such as mobile phones and smart TVs, may be recording them. The interviewee explained that there have been suggestions of putting cameras and microphones in set-top boxes, which would invade the privacy of users' homes. In addition, the interviewee pointed out that it is currently possible to gather and analyse what television a customer watches on the basis of subtitle data, without the need to carry out video or audio analysis. However, any algorithmic categorization of people based on the content they watch will prove difficult to use in predicting behaviour, as two very different people can watch similar programmes. The interviewee further argued that GDPR can serve to raise awareness about the importance of privacy: "[GDPR can encourage] an awareness among people towards asking what data is being collected from my behaviour? How much data does Facebook (for example) have from me and how much benefit is there in that for me?" (The interviewee).
Bias and Manipulation of Data
The interviewee argued that ethical concerns in physical systems are relatively easy to recognize, but that with digital systems and algorithms an entire system can become biased in a way that is difficult to recognize, for example in cases of gender discrimination. Machine learning (ML) systems learn from what humans teach them, yet many people assume that the information provided by an automated system is correct and unbiased; people trust these systems even when they are proven to produce false positives or false negatives. Furthermore, the manipulation of customers is possible through the lack of transparency in how these systems function. Informed consent is an important issue, as noted in the literature review, but there is often a lack of clarity on exactly what informed consent allows in particular cases. The interviewee expressed concern about reports on the behaviour of Cambridge Analytica and fake news, and questioned how to educate people so that they become aware of this type of manipulation: "this should be a part of the education, even if people understand much more how this manipulation is taking place nowadays, to recognize when they are being manipulated is a valuable skill" (The interviewee). This also shows the need to increase awareness amongst the general public, as individuals should be more capable of identifying when they are being manipulated. Furthermore, the interviewee affirmed: "Democracy will be broken as long as people don't know when they are being manipulated" (The interviewee). The interviewee expressed personal worry about an increasingly Orwellian society and the surveillance that could be carried out with the kind of data Company A collects, particularly if governments were to have access to this data.
The interviewee held that in China the public knows that surveillance is taking place; in contrast, in the Western world most people do not really know what surveillance is being performed, nor who is conducting it. However, the interviewee did not produce any evidence to support this claim.
Responsibility and Consent
During the interview the issue of responsibility was raised, in particular the question of who is responsible for collecting data, overseeing that data collection, and informing the public as to what happens to the data: "Should it be related to individual education or corporate responsibility?" (The interviewee). The interviewee mentioned that Company A has public policies for handling customers' data and a code of conduct (Company A, 2015), and has trained data scientists in this area. It is possible for each individual subscriber to download the personal data the company stores about them, and for the company to manage customers' consent. This relates to the company's responsibility. The interviewee argued that Company A can always do better at informing customers and at analysing how well a customer has read and understood the consent. As an example, the interviewee mentioned the Google-inspired mindset of "don't do evil". The interviewee also felt that governmental organizations should be far more regulated than is currently the case.
Transparency and the Company's Vulnerability
It is frequently difficult to make algorithms public, and most customers would not have the ability to understand those algorithms or the input data needed to run them. Furthermore, algorithms change quickly and models are constantly being updated. From this, one can see why the notion of public "acceptance" of certain algorithms does not really work. The interviewee argued that, for a company, the more open you are, the more vulnerable you can be. From a financial perspective, the company has created a model which uses data to determine, for example, how profitable a customer is. However, Company A is not yet able to use that data to make decisions based on algorithms. Nonetheless, traditional rule-based systems make it easier for customers to game the system, giving them more power and control over the company's offers. This suggests that, as regards new CRM practices, it would clearly be in the company's interests for the algorithms not to be transparent. As the interviewee said: "we might be there on a couple of years, and this might affect the capacity of people to use that in their benefit, and the company will be less vulnerable" (The interviewee). It was argued that GDPR has raised general awareness of data-related privacy issues and led to discussions of personal data collection. It has encouraged people to question what is being collected and why. As a result, the organisational policies in place to address the use and handling of customer data have been made transparent to the user, who can request their data from the Company A website. Similarly, the company's code of conduct (Company A, 2015) is also available on the website. The code of conduct includes the following values: consumer orientation, responsibility, renewal, result orientation, and collaboration. The website also mentions the importance of maintaining the trust of the customer and of confidentiality in managing customer data.
However, this does not always sit easily with the problems raised above regarding the desired opacity in at least some uses of algorithms, as well as the challenge of codes becoming outdated.
There is a need to protect customers from being exploited and misled through data manipulation and analysis. New advances in CRM practices and data technologies bring social benefits, but also ethical issues that must be examined. This case study demonstrates how companies are using SIS in CRM strategies, and examines the social and ethical implications of doing so. The main purpose of this case study was to uncover which ethical issues arise in the use of SIS in CRM. Based on the literature review, the following issues were identified: autonomy, control, manipulation, privacy, knowledge, informed consent, bias, responsibility and trust.
Table 2: Comparison of ethical issues between literature and case study (the surviving entries include manipulation of people, raised in both the literature and the case study, together with the lack of knowledge by customers and the existence of different ethical values globally).

The interview conducted at Company A offered a recent perspective on the real-world applications of data management and also addressed these issues. This case study raises a number of concerns with SIS in CRM:
- intrusion and questions over the usefulness of data;
- issues of manipulation and the lack of customers' autonomy;
- the potential for bias in CRM analysis, because it is not value-free;
- the cultural differences of ethical issues in different countries;
- the lack of knowledge and awareness by the customer regarding the collection and use of their information; and
- the necessity of companies taking responsibility for data collection and use.
One category that was mentioned in the interview but not in the literature is that of cultural differences in an international setting. This relates to the fact that laws differ between countries, and that what is considered ethical also varies, which has implications for competition in the international market. Questions regarding the potential ongoing implications of international policies and laws are thus worth considering. This is of particular concern in cases where non-European countries may collect data and develop algorithms: algorithms may be designed and developed outside the EU in a manner incompatible with European regulations such as GDPR and, as explained above, could then be employed in Europe in competition with European-developed algorithms. The corporate concern, at least in this instance, is hence that European companies will be unable to develop competitively powerful algorithms, given the limitations on data collection resulting from GDPR. The power and potential of CRM data gathering lead to a variety of ethical issues. Overall, there is a need to address ethical issues in the technology sector and to stress the responsibility of companies in this rapidly changing CRM area. The comparison with the literature demonstrates that questions arose during the case study which have not been considered in academia, owing to the novelty and difficulty of these practices.
There is a dearth of literature specifically addressing ethical issues arising from SIS use in CRM. The focus in applied ethics has tended to be towards social media and insurance uses of algorithms, but CRM in a more general context has not received the attention it merits. Much of the research providing the background literature review was drawn from papers discussing ethical challenges with SIS in general rather than specifically in cases of CRM, and there are few CRM-related examples. While the interview has raised new ethical issues not covered in the literature, it only engaged with one interviewee from one company. The insights gained above are significant, but they could be supported or challenged by engaging with further interviewees and in other companies.
Contribution to Knowledge
The research outlined in this report makes several contributions to the state of knowledge regarding the use of SIS in CRM practices. This case study not only adds to the current literature and to business practices, but also offers policymakers new insights into these developing technologies. It provides a literature review of the most pertinent ethical, social and legal issues of CRM, and presents an interview with an employee working in CRM for one of the largest telecommunications companies in Scandinavia. The interview demonstrated that the academic literature largely raises issues of which practitioners are aware through their day-to-day involvement with SIS. One element of particular concern for the interviewee, which has not been explored in depth in the academic literature, is the global dimension of ethical values when it comes to developing SIS. There is a real concern that companies which develop SIS in a more ethical manner (by, for instance, adhering to laws such as GDPR) may suffer commercially as a result. While this should not be a concern in the short term, it may lead to the eventual replacement of those companies by others which are able to compete by not following (and not seeing the need to follow) the same ethical restrictions.
This case study is based on a single interview with a single company. While the information gained from that interview is insightful, further interviews should clearly be held with other businesses. These would help to determine whether the issues raised here are shared more widely or are unique to the company interviewed, or indeed whether there are other relevant ethical issues not explored here. It would also be valuable to pursue the concern that GDPR and other European regulations place European companies at a competitive disadvantage relative to non-European companies, which are able to collect data (and hence develop algorithms) with fewer restrictions. If this is the case, then work should be carried out to determine how best to protect European companies from being harmed for acting in an ethical and respectful manner. This case study has also shown that CRM technologies are in a state of continuous development. Algorithms, AI and ML will continue to advance, and further ethical analysis will be required to monitor their implementation in CRM practices. To this end, there is a need for an ongoing dialogue between the ethics community and the CRM community as to which business practices should be pursued in the future.
References

Beauchamp, T.L., 2009. Autonomy and Consent. In: Miller, F., Wertheimer, A. (Eds.), The Ethics of Consent: Theory and Practice. OUP USA, Oxford; New York, pp. 55–78.
Blázquez, M., 2014. Fashion shopping in multichannel retail: The role of technology in enhancing the customer experience. International Journal of Electronic Commerce 18, 97–116.
Boerman, S.C., Kruikemeier, S., Zuiderveen Borgesius, F.J., 2017. Online Behavioral Advertising: A Literature Review and Research Agenda. Journal of Advertising 46(3), 363–376. https://doi.org/10.1080/00913367.2017.1339368
boyd, d., Crawford, K., 2012. Critical Questions for Big Data. Information, Communication & Society 15(5), 662–679. https://doi.org/10.1080/1369118X.2012.678878
Braun, A., Garriga, G., 2018. Consumer Journey Analytics in the Context of Data Privacy and Ethics. In: Digital Marketplaces Unleashed. Springer, Berlin, Heidelberg, pp. 663–674.
Cadwalladr, C., Graham-Harrison, E., 2018. Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian.
Cambridge Analytica, 2017. Cambridge Analytica [WWW Document]. URL https://capolitical.com/?__hstc=163013475.3e49f3d22529e528b7209670352a2cae.1543235451820.1543235451820.1543235451820.1&__hssc=163013475.9.1543235451821&__hsfp=2315517664 (accessed 1.4.17).
Chen, I.J., Popovich, K., 2003. Understanding customer relationship management (CRM): People, process and technology. Business Process Management Journal 9(5), 672–688. https://doi.org/10.1108/14637150310496758
Company A, 2015. Company A Code of Conduct.
Company A, 2017. On Company A [WWW Document]. URL https://corporate.Company A.com/on-Company A/ (accessed 12.4.18).
Crawford, K., 2013. The Hidden Biases in Big Data [WWW Document]. Harvard Business Review. URL https://hbr.org/2013/04/the-hidden-biases-in-big-data (accessed 1.4.17).
De Montjoye, Y.A., Hidalgo, C.A., Verleysen, M., Blondel, V.D., 2013. Unique in the crowd: The privacy bounds of human mobility. Scientific Reports 3, 1376.
Dubois, L., 2015. 11 Best Web Analytic Tools. Retrieved from: https://www.inc.com/guides/12/2010/11-
Duhigg, C., 2012. How Companies Learn Your Secrets. The New York Times.
Ebeling, M., 2016. Healthcare and Big Data: Digital Spectres and Phantom Objects. Springer.
Enchautegui, M.E., 2013. Nonstandard Work Schedules and the Well-being of Low-Income Families (Paper 26). Urban Institute, Washington DC.
EU Parliament, 2002. Directive 2002/58/EC [WWW Document]. Off. J. 201 31072002 P 0037–0047. URL https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:EN:HTML (accessed 1.22.19).
EU Parliament, 2016. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance), OJ L.
Foster, V., Young, A., 2011. The use of routinely collected patient data for research: A critical review. Health 16, 448–463. https://doi.org/10.1177/1363459311425513
Hailwood, J., Gottlieb, D., 2003. U.S. Patent Application No. 09/972,277.
Harrison, P., Gray, C., 2010. The ethical and policy implications of profiling 'vulnerable' customers. International Journal of Consumer Studies 34(4), 437–442.
Hill, K., 2014. Facebook Manipulated 689,003 Users' Emotions For Science [WWW Document]. Forbes. URL http://www.forbes.com/sites/kashmirhill/2014/06/28/facebook-manipulated-689003-users-emotions-for-science/ (accessed 9.24.14).
IBE, 2018. Business Ethics and Artificial Intelligence [online]. Ibe.org.uk. Available at: https://www.ibe.org.uk/userassets/briefings/ibe_briefing_58_business_ethics_and_artificial_intelligence.pdf (accessed 20 Oct. 2018).
The interviewee. Company A, "Interview", Scandinavia, 10/03/2018.
Macnish, K., 2012. Unblinking eyes: the ethics of automating surveillance. Ethics and Information Technology 14, 151–167. https://doi.org/10.1007/s10676-012-9291-0
Macnish, K., 2018. The Ethics of Surveillance: An Introduction. Routledge, London.
Manson, N., O'Neill, O., 2007. Rethinking Informed Consent in Bioethics. Cambridge.
Mattioli, D., 2012. On Orbitz, Mac Users Steered to Pricier Hotels. Wall Street Journal.
Mittelstadt, B.D., Allo, P., Taddeo, M., Wachter, S., Floridi, L., 2016. The ethics of algorithms: Mapping the debate. Big Data & Society 3. https://doi.org/10.1177/2053951716679679
Moorthy, J., Lahiri, R., Biswas, N., Sanyal, D., Ranjan, J., Nanath, K., Ghosh, P., 2015. Big data: prospects and challenges. Vikalpa 40(1), 74–96.
Ngai, E.W., Xiu, L., Chau, D.C., 2009. Application of data mining techniques in customer relationship management: A literature review and classification. Expert Systems with Applications 36(2), 2592–2602.
Nguyen, B., Simkin, L., Canhoto, A.I. (Eds.), 2015. The Dark Side of CRM: Customers, Relationships and Management. Routledge.
Ohm, P., 2009. Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization. UCLA Law Review 57, 1701–1777.
O'Neil, C., 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown/Archetype.
Parise, S., Guinan, P.J., Kafka, R., 2016. Solving the crisis of immediacy: How digital technology can transform the customer experience. Business Horizons 59, 411–420.
Robinson, D., Yu, H., Rieke, A., 2014. Civil Rights, Big Data, and Our Algorithmic Future. Upturn.
Soudagar, R., Iyer, V., Hildebrand, V., 2012. The Customer Experience Edge: Technology and Techniques for Delivering an Enduring, Profitable, and Positive Experience to Your Customers. McGraw-Hill.
Taylor, M., 2019. 18 CRM Software Statistics for 2019 by SuperOffice [WWW Document]. SuperOffice. URL https://www.superoffice.com/blog/crm-software-statistics/ (accessed 4.1.19).
Weston, H., 2016. Data analytics as predictor of character or virtues, and the risks to autonomy. International Review of Information Ethics 24(05).
Winer, R.S., 2001. A framework for customer relationship management. California Management Review 43(4), 89–105.