In: Operations Management
1. What is Big Data? Why is Big Data different (from a data mart or a data warehouse)?
2. What are the benefits of Big Data?
3. What are some of the potential business benefits from implementing effective big data analytics?
4. How can an organization leverage Big Data?
For example, Big Data can be used to develop the next
generation of products and services. For instance,
manufacturers are using data obtained from sensors embedded in
products to create innovative after-sales service offerings such as
proactive maintenance to avoid failures in new products.
5. Traditionally, factories estimate that a certain type of equipment is likely to wear out after so many years. Consequently, they replace every piece of that technology within that many years, even devices that have much more useful life left in them. Big Data tools do away with such impractical and costly averages. The massive amounts of data they access and their unequalled processing speed can spot failing grid devices and predict when they will give out. The result: a much more cost-effective replacement strategy for the utility and less downtime, as faulty devices are detected far faster.
6. What can you say about how aviation companies are utilizing Big Data? Check your reading of the IBM white paper and how OEMs are transforming the aviation industry.
1.ANSWER
A Definition of Big Data
SAS perfectly captures Big Data as “a term that describes the large volume of data – both structured and unstructured – that inundates a business on a day-to-day basis.” But, as SAS points out, the amount of data is not as important as what organizations do with it: analyzing Big Data results in the insights you need to make better business decisions and strategic moves.
Lisa Arthur, Teradata Applications CMO and Forbes contributor, explains that Big Data “is a collection of data from traditional and digital sources inside and outside your company that represents a source for ongoing discovery and analysis.” She asserts that traditional data must be included in Big Data because it is an important piece of the Big Data picture. Indeed, incorporating data from all sources is key to optimizing the insights gained with Big Data.
In recent years, there has been a boom in Big Data because of the growth of social, mobile, cloud, and multi-media computing. We now have unprecedented amounts of data, and it is up to organizations to harness the data in order to extract useful, actionable insights. But, because traditional systems cannot store, process, and analyze massive amounts of unstructured data, organizations are turning to Big Data management solutions to turn unstructured data into the actionable data needed for gaining key insights into their business and customers.
Big Data is different for the following reasons:
These days, many people in the information technology world and in corporate boardrooms are talking about “big data.” Many believe that, for companies that get it right, big data will be able to unleash new organizational capabilities and value. But what does the term “big data” actually entail, and how will the insights it yields differ from what managers might generate from traditional analytics?
There is no question that organizations are swimming in an expanding sea of data that is either too voluminous or too unstructured to be managed and analyzed through traditional means. Among its burgeoning sources are the clickstream data from the Web, social media content (tweets, blogs, Facebook wall postings, etc.) and video data from retail and other settings and from video entertainment. But big data also encompasses everything from call center voice data to genomic and proteomic data from biological research and medicine. Every day, Google alone processes about 24 petabytes (or 24,000 terabytes) of data. Yet very little of the information is formatted in the traditional rows and columns of conventional databases.
Many IT vendors and solutions providers use the term “big data” as a buzzword for smarter, more insightful data analysis. But big data is really much more than that. Indeed, companies that learn to take advantage of big data will use real-time information from sensors, radio frequency identification and other identifying devices to understand their business environments at a more granular level, to create new products and services, and to respond to changes in usage patterns as they occur. In the life sciences, such capabilities may pave the way to treatments and cures for threatening diseases.
Organizations that capitalize on big data stand apart from traditional data analysis environments in three key ways:
1. Paying attention to flows as opposed to stocks
There are several types of big data applications. The first type supports customer-facing processes to do things like identify fraud in real time or score medical patients for health risk. A second type involves continuous process monitoring to detect such things as changes in consumer sentiment or the need for service on a jet engine. Yet another type uses big data to explore network relationships like suggested friends on LinkedIn and Facebook. In all these applications, the data is not the “stock” in a data warehouse but a continuous flow. This represents a substantial change from the past, when data analysts performed multiple analyses to find meaning in a fixed supply of data.
Today, rather than looking at data to assess what occurred in the past, organizations need to think in terms of continuous flows and processes. “Streaming analytics allows you to process data during an event to improve the outcome,” notes Tom Deutsch, program director for big data technologies and applied analytics at IBM. This capability is becoming increasingly important in fields such as health care. At Toronto’s Hospital for Sick Children, for example, machine learning algorithms are able to discover patterns that anticipate infections in premature babies before they occur.
The increased volume and velocity of data in production settings means that organizations will need to develop continuous processes for gathering, analyzing and interpreting data. The insights from these efforts can be linked with production applications and processes to enable continuous processing. Although small “stocks” of data located in warehouses or data marts may continue to be useful for developing and refining the analytical models used on big data, once the models have been developed, they need to process continuing data streams quickly and accurately.
The behavior of credit card companies offers a good illustration of this dynamic. In the past, direct marketing groups at credit card companies created models to select the most likely customer prospects from a large data warehouse. The process of data extraction, preparation and analysis took weeks to prepare — and weeks more to execute. However, credit card companies, frustrated by their inability to act quickly, determined that there was a much faster way to meet most of their requirements. In fact, they were able to create a “ready-to-market” database and system that allows a marketer to analyze, select and issue offers in a single day. Through frequent iterations and monitoring of website and call-center activities, companies can make personalized offers in milliseconds, then optimize the offers over time by tracking responses.
Some big data environments, such as consumer sentiment analysis, are not designed for automating decisions but are better suited for real-time monitoring of the environment. Given the volume and velocity of big data, conventional, high-certitude approaches to decision-making are often not appropriate in such settings; by the time the organization has the information it needs to make a decision, new data is often available that renders the decision obsolete. In real-time monitoring contexts, organizations need to adopt a more continuous approach to analysis and decision-making based on a series of hunches and hypotheses. Social media analytics, for example, capture fast-breaking trends on customer sentiments about products, brands and companies. Although companies might be interested in knowing whether an hour’s or a day’s changes in online sentiment correlate with sales changes, by the time a traditional analysis is completed there would be a raft of new data to analyze. Therefore, in big data environments it’s important to analyze, decide and act quickly and often.
However, it isn’t enough to be able to monitor a continuing stream of information. You also have to be prepared to make decisions and take action. Organizations need to establish processes for determining when specific decisions and actions are necessary — when, for example, data values fall outside certain limits. This helps to determine decision stakeholders, decision processes and the criteria and timeframes for which decisions need to be made.
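As a minimal illustration of this idea, the sketch below (in Python, with sensor names, control limits and sample readings that are purely hypothetical) watches a continuous stream of readings and raises an action only when a value falls outside its predefined limits:

    # Minimal sketch: monitor a continuous stream of readings and flag
    # values that fall outside predefined control limits.
    # The sensor names, limits and sample stream below are hypothetical.

    LIMITS = {
        "engine_temp_c": (40.0, 95.0),   # acceptable range (low, high)
        "vibration_mm_s": (0.0, 7.1),
    }

    def check_reading(sensor, value):
        """Return an alert dict if the value is outside its limits, else None."""
        low, high = LIMITS[sensor]
        if value < low or value > high:
            return {"sensor": sensor, "value": value, "action": "notify maintenance"}
        return None

    def monitor(stream):
        """Process readings as they arrive instead of batching them in a warehouse."""
        for sensor, value in stream:
            alert = check_reading(sensor, value)
            if alert:
                print(f"ALERT: {alert}")

    # Simulated stream; in practice this would be a message queue or sensor feed.
    sample_stream = [
        ("engine_temp_c", 72.4),
        ("vibration_mm_s", 9.3),   # out of range, triggers an alert
        ("engine_temp_c", 101.0),  # out of range, triggers an alert
    ]
    monitor(sample_stream)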
2. Relying on data scientists and product and process developers as opposed to data analysts
Although there has always been a need for analytical professionals to support the organization’s analytical capabilities, the requirements for support personnel are different with big data. Because interacting with the data itself — obtaining, extracting, manipulating and structuring it — is critical to any analysis, the people who work with big data need substantial and creative IT skills. They also need to be close to products and processes within organizations, which means they need to be organized differently than analytical staff were in the past.
“Data scientists,” as these professionals are known, understand analytics, but they also are well versed in IT, often having advanced degrees in computer science, computational physics, or biology- or network-oriented social sciences. Their upgraded data management skill set — including programming, mathematical and statistical skills, as well as business acumen and the ability to communicate effectively with decision-makers — goes well beyond what was necessary for data analysts in the past. This combination of skills, valuable as it is, is in very short supply.
As a result, some early adopters of big data are working to develop their own talent. EMC Corporation, for example, traditionally a provider of data storage technologies, acquired Greenplum, a big data technology company, in 2010 to expand its capabilities in data science and promptly started an educational offering for data scientists. Other companies are working with universities to train data scientists.
Early users of big data are also rethinking their organizational structures for data scientists. Traditionally, analytical professionals were often part of internal consulting organizations advising managers or executives on internal decisions. However, in some industries, such as online social networks, gaming and pharmaceuticals, data scientists are part of the product development organization, developing new products and product features. At Merck & Co. Inc, for example, data scientists (whom the company calls statistical genetics scientists) are members of the drug discovery and development organization.
3. Moving analytics from IT into core business and operational functions
Surging volumes of data require major improvements in database and analytics technologies. Capturing, filtering, storing and analyzing big data flows can swamp traditional networks, storage arrays and relational database platforms. Attempts to replicate and scale the existing technologies will not keep up with big data demands, and big data is changing the technology, skills and processes of the IT function.
The market has responded with a broad array of new products designed to deal with big data. They include open source platforms such as Hadoop, invented by Internet pioneers to support the massive scale of data they generate and manage. Hadoop allows organizations to load, store and query massive data sets on a large grid of inexpensive servers, as well as execute advanced analytics in parallel. Relational databases have also been transformed: New products have increased query performance by a factor of 1,000 and are capable of managing the wide variety of big data sources. Statistical analysis packages are similarly evolving to work with these new data platforms, data types and algorithms.
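To make the map/reduce idea behind Hadoop concrete, here is a minimal sketch in Python of the two functions a Hadoop Streaming job would run in parallel across a grid of servers. The input format (a device ID and a status per line) and the failure-counting task are hypothetical:

    # Minimal sketch of the map/reduce pattern that Hadoop parallelizes across
    # a grid of servers. The input format (device_id<TAB>status per line) is
    # hypothetical.
    # Run locally: cat events.tsv | python mr.py map | sort | python mr.py reduce
    # With Hadoop Streaming, the same script is passed as -mapper and -reducer.
    import sys

    def mapper():
        # Emit (device_id, 1) for every failure event seen in this input split.
        for line in sys.stdin:
            device_id, status = line.rstrip("\n").split("\t")
            if status == "FAILURE":
                print(f"{device_id}\t1")

    def reducer():
        # Input arrives sorted by key, so counts for a device are contiguous.
        current, count = None, 0
        for line in sys.stdin:
            device_id, value = line.rstrip("\n").split("\t")
            if device_id != current:
                if current is not None:
                    print(f"{current}\t{count}")
                current, count = device_id, 0
            count += int(value)
        if current is not None:
            print(f"{current}\t{count}")

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()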
Another disruptive force is the delivery of big data capabilities through “the cloud.” Although not yet broadly adopted in large corporations, cloud-based computing is well suited to big data. Many big-data applications use external information that is not proprietary, such as social network modeling and sentiment analysis. Moreover, big data analytics are dependent on extensive storage capacity and processing power, requiring a flexible grid that can be reconfigured for different needs. Cloud-based service providers offer on-demand pricing with fast reconfiguration.
Another approach to managing big data is leaving the data where it is. So-called “virtual data marts” allow data scientists to share existing data without replicating it. EBay Inc., for example, used to have an enormous data replication problem, with between 20 and 50 versions of the same data scattered throughout its various data marts. Now, thanks to its virtual data marts, the company’s replication problem has been dramatically reduced. EBay has also established a “data hub” — an internal website to make it easier for managers and analysts to serve themselves and share data and analyses across the organization. In effect, eBay has built a social network around analytics and data.
Coming to terms with big data is prompting organizations to rethink their basic assumptions about the relationship between business and IT — and their respective roles. The traditional role of IT — automating business processes — imposes precise requirements, adherence to standards and controls on changes. Analytics has been more of an afterthought for monitoring processes and notifying management about anomalies. Big data flips this approach on its head. A key tenet of big data is that the world and the data that describe it are constantly changing, and organizations that can recognize the changes and react quickly and intelligently will have the upper hand. Whereas the most vaunted business and IT capabilities used to be stability and scale, the new advantages are based on discovery and agility — the ability to mine existing and new data sources continuously for patterns, events and opportunities.
This requires a sea change in IT activity within organizations. As the volume of data explodes, organizations will need analytic tools that are reliable, robust and capable of being automated. At the same time, the analytics, algorithms and user interfaces they employ will need to facilitate interactions with the people who work with the tools. Successful IT organizations will train and recruit people with a new set of skills who can integrate these new analytic capabilities into their production environments.
A further way that big data disrupts the traditional roles of business and IT is that it presents discovery and analysis as the first order of business. Next-generation IT processes and systems need to be designed for insight, not just automation. Traditional IT architecture is accustomed to having applications (or services) as “black boxes” that perform tasks without exposing internal data and procedures. But big data environments must make sense of new data, and summary reporting is not enough. This means that IT applications need to measure and report transparently on a wide variety of dimensions, including customer interactions, product usage, service actions and other dynamic measures. As big data evolves, the architecture will develop into an information ecosystem: a network of internal and external services continuously sharing information, optimizing decisions, communicating results and generating new insights for businesses.
2.ANSWER
The following are the benefits of Big Data:
The concept of Big Data is nothing new. In fact, more and more companies, both large and small, are using big data and related analysis approaches as a way to gain more information to better support their company and serve their customers, benefitting from the advantages of big data.
The 3 Vs of Big Data:
Big Data is the combination of three factors: high volume, high velocity and high variety.
Volume
Data is collected and tracked from various sources, including business transactions, social media and machine-to-machine or sensor data. This creates large volumes of data.
Velocity
The data streams in at high speed and must be dealt with in a timely manner. The processing of that data, that is, the analysis of streamed data to produce near-real-time or real-time results, must also be fast.
Variety
Data comes in all formats: structured, numeric data in traditional databases as well as unstructured text documents, video, audio, email and stock-ticker data.
The importance of big data does not revolve around how much data a company has but how a company utilises the collected data. Every company uses data in its own way; the more efficiently a company uses its data, the more potential it has to grow. The company can take data from any source and analyse it to find answers that enable better, faster decision-making.
Conclusion: Big Data – A Competitive Advantage for Businesses
The use of Big Data to outperform peers is becoming common among companies. In most industries, existing competitors and new entrants alike will use the strategies resulting from the analyzed data to compete, innovate and capture value.
Big Data helps organizations create new growth opportunities and gives rise to entirely new categories of companies that combine and analyze industry data. These companies have ample information about products and services, buyers and suppliers, and consumer preferences that can be captured and analyzed.
3.ANSWER
Some of the potential business benefits from implementing effective big data analytics are mentioned below:
Big Data has the potential to help companies achieve real results by going through cycles of predictive modeling and data analysis. Information technology (IT) executives have attempted to figure out how they can use the four ‘V’s of big data – volume, variety, veracity and velocity.
Big Data Analytics leverages the four ‘V’s and delivers detailed insights for executing better decisions. Take the example of a marketer who can predict the customer registration pattern with the help of big data analytics. With detailed analytics, the decision maker can analyze when customers have registered the most and also know which marketing campaigns have resulted in increased registrations.
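A minimal sketch of that kind of analysis in Python with pandas follows; the records, campaign names and column names are made up purely for illustration:

    # Minimal sketch (Python/pandas) of the registration analysis described above.
    # The records below are hypothetical.
    import pandas as pd

    df = pd.DataFrame({
        "customer_id": [1, 2, 3, 4, 5, 6],
        "campaign": ["email", "social", "email", "search", "social", "social"],
        "registered_at": pd.to_datetime([
            "2023-01-05", "2023-01-20", "2023-02-02",
            "2023-02-14", "2023-02-21", "2023-03-01",
        ]),
    })

    # When did customers register the most? Count sign-ups per month.
    by_month = df["registered_at"].dt.to_period("M").value_counts().sort_index()
    print(by_month)

    # Which marketing campaigns resulted in the most registrations?
    by_campaign = df.groupby("campaign")["customer_id"].nunique().sort_values(ascending=False)
    print(by_campaign)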
Apart from marketing, there are many other business functions that can add value by leveraging the power of Big Data Analytics.
Thus, big data analytics helps organizations take better and more cost-effective decisions that accentuate the effectiveness of their business strategies and ultimately boost the bottom line.
4.ANSWER
Organizations can leverage Big Data in the following ways:
“Data will talk to you if you’re willing to listen,”— Jim Bergeson.
Few can dispute that. However, the challenge comes when data transforms into bundles and stacks of unorganized and unstructured data sets. The challenge comes with listening to big data and making sense of what it says.
With big data, the conversing data becomes loud and noisy. You don’t hear the voice; you hear the cacophony. This is where organizations struggle.
And, amidst a struggle, you look up to the leaders to see how they are rising to the challenge. You observe, you learn, you implement, and you adapt.
This is the first article of my “Under the Spotlight” series, where we will look at how leading organizations are leveraging big data and analytics, filtering out white noise from the discord in the process—to carefully follow and benefit from what data has to say.
These organizations are spread across different industry verticals, including the aerospace, sports, and life sciences industries, along with government agencies.
Airbus Leveraging Big Data and Analytics to Improve Customer Experience
Airbus has been a global leader in the aerospace industry for the last four decades, specializing in designing and manufacturing aerospace products, services, and solutions.
Operating in a complex and highly-competitive industry means that Airbus has to be at its best in terms of efficiency, productivity, and innovation to deliver an unmatched service experience to its customers. Big data and analytics are helping the company in that respect.
Using the IBM InfoSphere Data Explorer, Airbus integrates data discovery, navigation, analysis, and contextually-relevant view of more than 4TB of indexed data that is spread across different business units. All this data is then centrally accessible for people working in the service department, equipping them with valuable information to execute timely airline maintenance programs.
Leonard Lee, the vice president and head of new business models and services at Airbus Group, said in a recent interview, "We have tons of data. An aircraft is a very talkative machine. It produces petabytes of data. And today, in general, in the aerospace industry, only two percent of that data is used in any constructive way. So, we plan to leverage all the richness in that data to help improve our customer experience by driving initiatives like predictive maintenance. This way, our customers can get airplanes back in the air as quickly as possible.”
This one application of big data and analytics has accounted for savings of more than $36 million for the company in a single year.
Another way the company has been leveraging big data and analytics is to improve lead time in the production of aerospace units so that their customers can be facilitated with deliveries in due time.
Each shop floor has been empowered with digital solutions, which allow workers across different production units to update the status of a project in real time. This data can then be communicated and shared between workers positioned across different shop floors, reducing paperwork and inspections and encouraging a proactive production approach. The newest Airbus rotorcraft, the H160 helicopter, has been built on this newly established production model.
Lee further expanded on the company’s strategy, adding, “What we are trying to do with our digital transformation effort is to build digitally-enabled, data-driven business models. We are working with strategic partners like Palantir, and others, to capture more value across the value chain, by having layers of analytics, machine learning, and artificial intelligence, so that we can build solutions that would help us improve our customers’ experiences.”
NFL Teams Can Now Leverage Big Data and Analytics to Improve Performance Levels
In April, the NFL Players Association (NFLPA) entered into a partnership with WHOOP, a company that manufactures wearable devices. The objective of the alliance was to equip the athletes with a technology that could help them track their health and performance levels.
The WHOOP device can be strapped on an athlete’s forearm, wrist, or bicep, giving insight into his body while he trains or recovers.
Bioethicists Katrina Karkazis and Jennifer Fishman, commented on this innovative initiative, in an article, saying that if applied judiciously, responsibly, and ethically, biometric data technologies in professional sport have the potential to reduce injuries, improve performance, and extend athletes’ careers.
Isaiah J. Kacyvenski, a former American football linebacker, welcomed the idea, saying, “In the end, playing football is a job for us—athletes. As a football player, I always thought of my body as a business. The ability to create more value for the job you do should be acknowledged.”
The NFLPA has announced that the data and the insights gained would be the sole property of the athletes, and that they could use or sell them in any way they want. The announcement further stated that use of the device during a match is prohibited.
Big Data and Analytics May Speed Up Finding Cure for Cancer
Business, sports, and even the life sciences industry are finding uses for big data. The life sciences industry is all about researching and expanding our understanding of the human body to keep it healthy and disease-free.
A human body is a complex system of cells, tissues, and organs, with various biological molecules forming the fabric of this complex system. This system is then regulated by sets of genes, which are present in our DNA.
To put these details and their complexity into a quantitative context: expand on each of them, and you will come across petabytes of data.
This shows the extensive amount of data that life scientists have to manage and decipher on a regular basis. But the industry is up for the challenge. It believes that big data and analytics can help speed up the process of finding a cure for various diseases, even something as complicated as cancer.
“By leveraging big data and analytics, we can begin understanding the basic facts about how tumors grow, how heterogeneous tumors are and what are the targets, so we can create new drugs that work for particular tumors with particular genomic signatures,” said Robert Grossman, the principal investigator of the project Genomic Data Commons, in an interview with Chicago Inno.
What is the Genomic Data Commons?
The project, Genomic Data Commons, is about making cancer data available to researchers worldwide so that they can contribute to the findings and help speed up the search for cancer treatment. The data repository is housed at the University of Chicago and is one of the largest open access repositories in the world.
“On the research side, the majority of researchers in cancer, I think, find the amount of data frustrating,” said Grossman. “They want to use all available data but to set up an environment, to manage it, keep it secure and compliant—the process is just overwhelming. Our role is to bring together the large public research data sets to consistently analyze them and make it available in a digestible form to the research community to accelerate the pace of research.”
The project was launched a year ago, and the team believes that over the next six to nine months, they would be well resourced to make announcements regarding discoveries made through the use of GDC.
Government Agencies Leveraging Big Data and Analytics to Ensure Safety of Citizens
One of the primary roles of government agencies is to collaborate and communicate with each other to ensure the safety and wellbeing of citizens.
U.S. government agencies, both at the federal and state level, have always worked hard to make sure that they deliver on this responsibility. And now, they are leveraging big data and analytics to reinforce their efforts and strategy. Data and analytics is not a new domain for government executives. However, as the volume of data rises and budgets get strained, the challenge is to use big data and analytics solutions that give faster and more precise insights for agencies to respond proactively and quickly. An example of one such deployed solution is SPATIOWL by Fujitsu. The platform gathers traffic movement and transportation-related data that comes from sensors installed in urban areas. This data can then be used to identify accident hotspots—areas where there is increased passenger and vehicle movement—so that preventative measures can be taken in advance to mitigate accident risks.
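Purely as an illustration (this is not Fujitsu's actual method), a hotspot analysis of this kind can be sketched in Python by binning location-tagged sensor events into grid cells and ranking the busiest cells; the coordinates and cell size below are hypothetical:

    # Illustrative sketch: bin GPS-tagged sensor events into a coarse grid
    # and rank the busiest cells as candidate hotspots. Data is made up.
    from collections import Counter

    # Hypothetical events: (latitude, longitude) of vehicle/passenger detections.
    events = [(35.6895, 139.6917), (35.6890, 139.6920), (35.7000, 139.7000),
              (35.6893, 139.6915), (35.6591, 139.7006)]

    CELL = 0.01  # grid cell size in degrees (roughly 1 km); chosen for illustration

    def cell_of(lat, lon):
        # Snap a coordinate to the corner of its grid cell.
        return (round(lat / CELL) * CELL, round(lon / CELL) * CELL)

    counts = Counter(cell_of(lat, lon) for lat, lon in events)
    for cell, n in counts.most_common(3):
        print(f"candidate hotspot around {cell}: {n} events")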
Another example is the use of big data and analytics to anticipate natural disasters and improve disaster management activities. Government agencies are leveraging the use of technology to acquire high-resolution satellite imagery and seismic data. With the help of analysis offered by machine learning and artificial intelligence, this data is then combined with historical information to identify patterns and predict natural disasters. Moreover, platforms are being integrated with predictive disaster algorithms that allow government agencies to monitor in real-time the different delivery channels that make disaster management services accessible. As a result, the standards of delivered services are being improved.
Conclusion
The world of big data and analytics is challenging but insightful. It provides actionable insights to help businesses and organizations automate their processes, gain insight into their target market, and optimize their existing operations for improved productivity and efficiency.
But, only if one is willing to embrace its cacophonic nature. And, based on these examples, only a few can dispute that.
6.ANSWER
Aviation companies are utilizing Big Data
Big data is reshaping business. This is particularly the case in the airline industry, where numerous big data case studies have shown how the industry is evolving.
However, many airlines are not taking full advantage of the data they have. With an enormous reservoir of data at their disposal, airlines can use big data technology to transform the way they do business. By prioritizing data collection and analysis, even small airlines can respond to customer demands and market trends with precision and agility. So how are major airlines benefiting from big data? Here, we introduce how data can enhance airline operations and discuss five inspiring case studies.
The key benefits of leveraging big data analytics
Leveraging insights from big data can give airlines a huge advantage over competitors. From booking, check-in, and boarding through to the flight itself, airlines can learn a huge amount about their client base. Along with loyalty programs, airlines arguably generate more customer data than any other industry. Within this information lies a wealth of valuable intelligence that impacts operations, efficiency, and service.
1. Smarter maintenance
Big data helps airlines to better maintain their aircraft. Take fuel for example; fuel accounts for 17% of all airline operating costs, making it the most significant overhead after labor. Therefore, fuel efficiency is a critical metric. With big data, airlines can identify new efficiencies. Greater computational power has allowed airlines to gather and process huge volumes of data that enable them to analyze fuel consumption on a per-trip basis. For instance, Southwest Airlines collects data from sensors embedded in aircraft that measure wind speed, temperature, and plane weight alongside fuel consumption.
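A hedged sketch of such a per-trip analysis is shown below; the figures and column names are invented for illustration, not real airline data:

    # Minimal sketch (Python/pandas) of per-trip fuel analysis: relate fuel
    # burned to headwind and take-off weight for each flight. Hypothetical data.
    import pandas as pd

    trips = pd.DataFrame({
        "flight":            ["WN101", "WN102", "WN103", "WN104"],
        "fuel_kg":           [4100, 5200, 3900, 6100],
        "distance_km":       [1100, 1250, 1080, 1400],
        "headwind_kts":      [12, 35, 5, 40],
        "takeoff_weight_kg": [62000, 65000, 61000, 67000],
    })

    # Fuel burned per kilometre flown, trip by trip.
    trips["fuel_per_km"] = trips["fuel_kg"] / trips["distance_km"]

    # How does fuel efficiency co-vary with headwind and take-off weight?
    print(trips[["fuel_per_km", "headwind_kts", "takeoff_weight_kg"]].corr())

    # Flag the least efficient trips (top 25% of fuel burn per km) for review.
    threshold = trips["fuel_per_km"].quantile(0.75)
    print(trips[trips["fuel_per_km"] > threshold])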
However, these advantages do not end at fuel efficiency. For example, Boeing analyzes 2 million conditions daily across 4,000 aircraft as part of its Airplane Health Management system. This intelligence – which includes mechanical analysis, in-flight metrics, and shop findings – helps Boeing to plan maintenance and distribution. To illustrate, this system can predict failures and facilitate preemptive action. In practice, this approach saves the company $300,000 annually in service delays and repair costs.
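The general predictive-maintenance idea can be sketched as follows; this is not Boeing's actual Airplane Health Management system, and the features, labels and data are generated randomly purely so the example runs:

    # Hedged sketch of predictive maintenance: train a classifier to flag
    # components likely to fail from recent sensor summaries. Synthetic data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.normal(5.0, 1.5, n),    # average vibration
        rng.normal(80.0, 10.0, n),  # maximum temperature
        rng.integers(0, 3000, n),   # cycles since overhaul
    ])
    # Toy label: high vibration plus many cycles makes failure more likely.
    y = ((X[:, 0] > 6.0) & (X[:, 2] > 1500)).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

    # Components predicted to fail soon become candidates for preemptive service.
    print(classification_report(y_test, model.predict(X_test)))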
2. Safer flights
By capturing flight incident data, regulators can improve safety across the aviation industry. Recently, the European Aviation Safety Agency launched the Data4Safety program, which collects and analyzes in-flight telemetry data, air traffic control information, and weather forecasts to detect risk. The program will allow regulators to determine safety risks and advise stakeholders. By combining big data analytics and computational power, this program aims to strengthen weak links in the aviation chain.
3. Improved service
While there are significant operational gains, big data can also help airlines to enhance customer service. Instead of simply identifying successful products, airlines can use big data to drill down into customers’ buying habits. By analyzing variables and aggregating historic information, airlines can predict and model customer behavior to generate personalized offers. This smart approach not only drives ticket sales, it also enhances opportunities for upselling, such as baggage fees and onboard refreshments.
Big data in aviation: 5 case studies
These scenarios demonstrate how airlines can leverage technology and data to improve operational performance. Now, big data is propelling airlines towards a new, more innovative future. Below, we detail five case studies that show how major players in the industry are using big data to the fullest advantage.
1. Encourage loyalty: United Airlines
Tailor-made offers will always appeal to the customer, thus encouraging loyalty. Airlines are in the fortunate position of being able to learn an enormous amount about their client base from data. Even a single booking contains data which can teach an airline a huge amount about its customers. For instance, United Airlines use their “collect, detect, act” protocol to analyze over 150 variables in each customer profile. These analyses measure everything from previous purchases to customer preferences in order to generate a tailor-made offer. The collect, detect, act initiative has increased United’s year-to-year revenue by over 15%.
2. Get to know the customer: British Airways
British Airways uses an intelligent ‘Know Me’ feature to provide personalized search results to customers. In this impressive big data case study, BA identified that their customer base largely consists of busy, time-pressed professionals who require fast, concise results. Therefore, ‘Know Me’ uses in-depth data analysis to provide relevant and targeted offers for their consideration. BA received a huge amount of positive feedback from clients who loved the fact that the company understood their travel needs.
3. Deploy artificial intelligence: EasyJet
Many airlines go a step further than basic data collection. With new technology, it is possible for companies to analyze big data accumulated from purchase activity to demand patterns. For instance, if an airline sees the demand for a certain route increasing, they can adjust prices accordingly. From this information, the airline can also identify which customer segments are price sensitive, and determine a segment’s price range for a given route.
A related big data case study comes from EasyJet, which invested in an artificially intelligent algorithm that determines seat pricing automatically, depending on demand. The system can also analyze historical data to predict demand patterns up to a year in advance. These analytics can also impact future decision-making about new routes, schedules, and codeshare alliances.
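As an illustration only (EasyJet's actual algorithm is proprietary and not described here), a demand-sensitive seat price could be sketched as a base fare adjusted by how full the flight is and how close departure is; all coefficients are made up for the example:

    # Illustrative sketch of demand-based seat pricing. Coefficients are invented.
    def seat_price(base_fare, seats_sold, capacity, days_to_departure):
        load_factor = seats_sold / capacity
        demand_uplift = 1.0 + 0.8 * load_factor                         # fuller flight, higher price
        urgency_uplift = 1.0 + max(0, 14 - days_to_departure) * 0.03    # late booking, higher price
        return round(base_fare * demand_uplift * urgency_uplift, 2)

    print(seat_price(base_fare=49.0, seats_sold=150, capacity=180, days_to_departure=3))
    print(seat_price(base_fare=49.0, seats_sold=60, capacity=180, days_to_departure=60))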
4. In-flight intelligence: Southwest Airlines
In flight, vast amounts of data are generated throughout the journey – pilot reports, warning reports, control positions, and air traffic control communications. When this data is closely monitored and analyzed, it can streamline operations and improve safety. For example, Southwest Airlines has teamed up with NASA to continually improve airline safety. By using intelligent algorithms, Southwest and NASA have created an automated system that can crunch an enormous amount of data to flag anomalies and prevent accidents.
5. Making lost bags a thing of the past: Delta
American airline Delta has developed an app which allows customers to track their bags on their smartphones. The concept is simple – the app uses exactly the same technology that the Delta ground staff use. So far, the app has been downloaded over 11 million times by Delta customers globally.
OEMs are transforming the aviation industry in the following ways:
Living travel experience
Travellers will experience seamless journeys tailored to their habits and preferences. Companies along the Aviation, Travel and Tourism industry journey will optimize customer experience by collecting and exchanging data, and continuously generating insights. In time, travel will become frictionless, blending seamlessly with other everyday activities.
Enabling the travel ecosystem
Ecosystem roles are blurring as stakeholders throughout the customer journey vie to own the customer relationship. Digital platforms that enable ecosystem alliances will continue to emerge, as asset and information sharing become increasingly important from a B2B perspective.
Digital enterprise
Digital technologies that revolutionize manufacturing, optimize the real-time use of assets and eventually augment the industry workforce will transform operations. Innovations such as 3D printing, AI, the Internet of Things (IoT), virtual reality (VR) and digital platforms will enable flexible working and changes to core operational processes.
Safety and security
As identity management becomes increasingly digital, a collaborative effort towards boosting cybersecurity and protecting the privacy of traveller data will be crucial to maintaining customer trust and public safety. Digital technologies (e.g. biometrics such as facial recognition, IoT, crowd analytics and video monitoring via AI) will be used to create a ubiquitously secure environment.
Key points to consider
Maximizing the value of digitalization in aviation, travel and tourism will require concerted action from industry leaders, regulators and policy-makers. A series of actions has been identified for ecosystem participants looking to make digital transformation a success:
i. Transform legacy systems into agile, interoperable platforms to enable plug-and-play interactions between partners in the ecosystem.
ii. Support the transition of the workforce by reskilling current employees through training, and empower educational institutions to design curricula that prepare the next generation for the digital economy.
iii. Develop a multistakeholder approach – involving private, public and civil-society organizations – to deliver regulatory frameworks that define the appropriate uses of data.