Big Data at Sears
When it comes to the adoption of information technology, Sears was years ahead of most retailers, implementing an enterprise data warehouse in the 1980s while most retailers were still relying on manually updated spreadsheets to examine their sales numbers. These days the company is using big data technologies to accelerate the integration of petabytes of customer, product, sales, and campaign data in order to increase marketing returns and bring customers back into its stores. The retailer uses Hadoop not only to store data but also to process transformations and integrate heterogeneous data more quickly and efficiently than ever. “We’re investing in real-time data acquisition as it happens,” says Oliver Ratzesberger, Vice President of Information Analytics and Innovation at Sears Holdings. “No more ETL. Big data technologies make it easy to eliminate sources of latency that have built up over a period of time.” The company is now leveraging the open source projects Apache Kafka and Storm to enable real-time processing. “Our goal is to be able to measure what’s just happened.” The company’s CTO, Phil Shelley, has cited big data’s capability to cut the release of a set of complex marketing campaigns from eight weeks to one, and the improvements are still being realized. Faster and more targeted campaigns are just the tip of the iceberg for the retailer, which recently launched a subsidiary, MetaScale, to provide non-retailers with big data services in the cloud. Source: Big Data in Big Companies, Thomas H. Davenport and Jill Dyché, May 2013 (Go to Suggested Readings to view full article)
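The article does not describe Sears' pipeline in detail, but the core of "measuring what's just happened" is windowed aggregation over an event stream, which is what stream processors like Storm compute over Kafka topics. The pure-Python sketch below illustrates that idea only; the class, the event keys, and the 60-second window are illustrative assumptions, not Sears' implementation.

```python
from collections import deque
import time

class SlidingWindowCounter:
    """Count events per key over a rolling time window, a stand-in for
    the kind of 'what just happened' aggregation a stream processor
    performs over an event feed."""

    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.events = deque()  # (timestamp, key) pairs, oldest first

    def record(self, key, timestamp=None):
        """Append one event; timestamp defaults to the current time."""
        ts = timestamp if timestamp is not None else time.time()
        self.events.append((ts, key))

    def counts(self, now=None):
        """Return per-key counts for events inside the window ending at `now`."""
        now = now if now is not None else time.time()
        # Evict events that have fallen out of the window.
        while self.events and self.events[0][0] < now - self.window:
            self.events.popleft()
        totals = {}
        for _, key in self.events:
            totals[key] = totals.get(key, 0) + 1
        return totals

# Simulated point-of-sale events with explicit timestamps (seconds):
c = SlidingWindowCounter(window_seconds=60)
c.record("store_42:sale", timestamp=0)
c.record("store_42:sale", timestamp=30)
c.record("store_7:sale", timestamp=90)
print(c.counts(now=100))  # → {'store_7:sale': 1}  (older events evicted)
```

A real deployment would feed `record` from a consumer loop reading a message queue, but the eviction-and-count logic is the same.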
Big Data at Macys.com
Macys.com is considered the equivalent of a single store in the giant retailer’s structure, but it’s growing at a 50% annual rate—faster than any other part of the business. The division’s management is highly oriented toward, and knowledgeable about, IT, data, and analytical decision-making. Like other online retailers, Macys.com is heavily focused on customer-oriented analytical applications involving personalization, ad and email targeting, and search engine optimization. Within the Macys.com analytics organization, the “Customer Insights” group addresses these issues, but there is also a “Business Insights” group (focused primarily on supporting and measuring activity around the marketing calendar) and a “Data Science” organization. The latter addresses more leading-edge quantitative techniques involving data mining, marketing, and experimental design. Macys.com utilizes a variety of leading-edge technologies for big data, most of which are not used elsewhere within the company. They include open-source tools like Hadoop, R, and Impala, as well as purchased software such as SAS, IBM DB2, Vertica, and Tableau. Analytical initiatives are increasingly a blend of traditional data management and analytics technologies and emerging big data tools. The analytics group employs a combination of machine learning approaches and traditional hypothesis-based statistics. Kerem Tomak, who heads the analytics organization at Macys.com, argues that it’s important not to pursue big data technology for its own sake. “We are very ROI-driven, and we only invest in a technology if it solves a business problem for us,” he noted. Over time there will be increasing integration between Macys.com and the rest of Macy’s systems and data on customers, since Tomak and his colleagues believe that an omnichannel approach to customer relationships is the right direction for the future. Source: Big Data in Big Companies, Thomas H. Davenport and Jill Dyché, May 2013 (Go to Suggested Readings to view full article)

Price-Prediction Engine
In 2011, a startup company in Seattle known as Decide.com set out on a bold ambition: building a price-prediction engine for zillions of consumer products, mostly technology gadgets. The company deployed powerful computers to obtain data feeds from e-commerce sites and other price and product information that could be found on the Internet. Not only did the company have to collect the ever-changing pricing data at all times, it also collected text and analysed the words to identify when a product was being discontinued or a newer model was due to be launched, which is crucial for determining the prices of outgoing and incoming products at this crossroads of the product life cycle. Over a one-year period, Decide.com was analysing 4 million products using over 25 million price observations. These observations revealed oddities in the retail world that people had never been able to see before - prices might temporarily increase for older models once new models are introduced. Many shoppers would purchase the outgoing model believing that its price would drop to make way for the incoming model, and thus ended up paying more for the outgoing model! Decide.com could spot such unusual price spikes and warn consumers to wait. The company achieved an accuracy rate of 77% and provided average potential savings of $100 per product. It was so confident in its predictions that it would reimburse the price difference to paying members of the service. The huge success of Decide.com got noticed, and the company was acquired by eBay. The following announcement is taken from the website of Decide.com: Decide community and friends, As you may have heard, Decide was acquired by eBay, the worldwide commerce leader. Since we started three years ago, we've been on a mission to bring you a more transparent shopping experience and help you feel more confident in your purchase decisions - in fact, we've been downright obsessed with it.
Over the past year, we began experimenting with eBay and discovered an exciting opportunity to apply our team's strength in data and predictive analytics to help over 25 million eBay sellers make smarter decisions and be more successful. We want to thank you - our loyal users, members, media partners and investors. You supported and trusted us and are the reason we have this incredible new opportunity ahead of us. We will continue to bring you the same innovative products and thinking as part of the eBay family. Thank you for being a part of our journey! Mike Fridgen, CEO, Decide

Using Big Data in Product Design
In a Southeast Asian nation whose three main ethnic groups are Malay, Chinese and Indian, one cable television operator designed ethnic viewing packages based solely on demographics: a Malay package for Malay families, a Chinese package for Chinese families, and an Indian package for Indian families. This was considered safe and avoided selling inappropriate or sensitive content to the wrong target audiences. However, this assumption did not necessarily hold when Big Data was adopted to analyse viewers' behaviour. This was done by first extracting the languages watched (Small Data) from the viewing logs (Big Data). It was then discovered that some Malay or Indian families were viewing Chinese content and vice versa. At first, the management did not think that the findings were right, as it seemed impossible to sell Chinese content to Malay or Indian families, and suspected that the demographic data was wrong. The management speculated that it could be a family of mixed marriage, or a child learning Chinese by tuning into Chinese content, or the maid watching, or any number of other reasons. The study could not say why these families watched content in another language, but it conclusively found that they watched content in multiple languages. Based on the outcome of the study, the management decided to try cross-ethnic viewing packages, and the result was a huge increase in sales and revenue. This illustrates the use of Big Data in helping a company make decisions in product design. This case study also highlights some important shifts in Big Data, especially Shift 3: when dealing with Big Data, we move away from the age-old search for causality - a human instinct to search for causes. Instead of looking for causes, we are satisfied with patterns and correlations in the data that provide us with unprecedented insights. The correlations may not tell us precisely why something is happening, but they affirm that it is happening.

Free Public Wi-Fi Networks at Shopping Centres
A technology solution company in Australia, through its installation and management of free public Wi-Fi networks for shopping centres and retailers, is putting Big Data into the palms of retailers. Through a connection to the consumer’s mobile phone, consumer data can be captured, tracked and distilled for the retailer, allowing them to make data-driven decisions and ultimately maximise their profits. Providing free Wi-Fi appears to be about convenience and an enhanced shopping experience. Behind it, however, the technology is actually collecting information about each registered customer to design targeted advertising, which the retailers can push to customers based on shopping habits and purchasing history. The technology tracks what consumers are doing with their smartphones once they are on the free Wi-Fi network, following them around the shopping centre (creating a heat map to see how much time they spend at each location) and recording current and past shopping history. This is done through four steps:
1. Connect - customers register once (valid across all shopping centres operated by the same company) to use the free Wi-Fi network;
2. Understand - customer habits and behaviours are monitored, online and offline, in store and out of store;
3. Engage - targeted promotions are delivered to customers' smartphones in the form of SMS, email, social media or downloaded apps; and
4. Retain - continued follow-up with loyalty points, rewards and redemption is implemented to create repeat customers.
Many of the major Australian retail groups, including GPT Group, Brookfield, and Ipoh Garden (which owns Sydney’s QVB, Galeries, Strand Arcade and Chifley Plaza), are already implementing this solution, exploiting Big Data to drive sales.

Hotel Chain Uses Big Data to Increase Bookings
The average flight cancellation rate due to bad weather is about 1%-3% of total daily flights.
This is equivalent to 150 to 500 cancelled flights, or around 25,000 to 90,000 stranded passengers, every day. These stranded passengers need a place to stay overnight. Red Roof Inn recognised the opportunity to target them with offers of overnight hotel accommodation. It also recognised that many travellers use smartphones to look for emergency accommodation. Red Roof Inn therefore implemented its Big Data strategy by sourcing freely available public information on weather and flight cancellations, organising information about airport and hotel locations, and building an algorithm that factored in weather severity, travel conditions, flight cancellation rates, time of day, and other variables. It combined this with geo-based mobile marketing campaigns that sent targeted advertisements about a nearby Red Roof Inn to the smartphones of stranded passengers. The payback from Big Data was convincing: Red Roof Inn subsequently enjoyed a 10% increase in bookings from 2013 to 2014.

Pizza Chain Makes More Sales in Times of Power Outage
Using the same methodology as Red Roof Inn, a pizza chain uses a mobile app and mobile marketing campaigns to deliver discount coupons to areas experiencing power outages that leave residents unable to cook. This is made possible because power suppliers often announce outages on their websites, which the chain combines with other consumer-related data. These mobile, location-based marketing campaigns achieve a 20% response rate.

Big Data Initiatives by Country > Cambodia
Open Development Cambodia
Open Development Cambodia (ODC) is an ‘open data’ website, the first of its kind in Southeast Asia. The open data movement is based on the simple premise that data collected for the public interest should be publicly available without restrictions. Information or data in the public domain should be freely available to everyone to use and republish as they wish.

Big Data is Transforming Commercial Construction
Construction is a costly and time-consuming process. There are several layers to every project, and average profit margins are relatively narrow. That’s why any and every money-saving measure counts. It’s nearly impossible to flawlessly budget, manage, and organize a construction project. From employees to suppliers and logistics, there’s one tool that can help the multifaceted building process: big data. Data has always been vital to construction, but tools now exist that enable managers to gather and make use of it in a more streamlined and efficient manner. Building Information Modeling (BIM) involves using 3D virtual models to help a team better plan, design, construct, and manage building structures. BIM has been around for decades, and now many are calling for the integration of big data into the process. By adding data, these programs could also allow designers to more easily spot trends or make predictions on a project.
DATA HELPS TRACK AND MANAGE ASSETS
There are plenty of assets a construction company must look after. Moving tools to and from, or around, construction sites is no simple business. Tools and goods can come from several different sources, including far-off factories or other construction sites. Data isn’t particularly helpful in keeping track of every little nail, but it can help managers see which tools are where. If a manager can see the location and details of these assets, they can more effectively make decisions about how to utilize them. Data can reveal how quickly each asset can be moved to a new location. It can also reveal which tools are being used, when, and how. Perhaps a vehicle sits unused for only a single day, or a wrong decision wastes a few hours moving one tool to another location. These sound like rather small details, but each small decision can play a big role in the overall health of a project. Cutting down on waste at every step is key to making a profit.
Small changes can and do make a difference to the bottom line. Data also allows for better onsite organization. Behemoth-sized, slow-moving vehicles are hard to maneuver around a tight construction site. Sensor-equipped assets can be tracked and optimized, minimizing wasted time, money, and resources. Sensors can also gather valuable data for analysis. To this end, data proponents are praising analytics’ ability to give consistent and easy-to-understand status updates to teams. Regular updates allow decisions and steps to be evaluated more often. Temperature, humidity, and stress readings can also be analyzed to determine how a particular building is performing and to alert the team to any sudden changes.
USING DATA TO MORE ACCURATELY BUDGET AND PLAN
Before reaching the building site, data can tell managers what to expect. Collecting and sharing accurate, useful information between the several parties involved in a project is no easy task. From the very beginning, data can help suppliers, builders, and managers all have a better idea of what a project will require. This means they can make better predictions and more accurately evaluate a budget or timeline. Better data visualization and simulations also facilitate communication between architects, engineers, and construction workers. Instead of constant back and forth over minor changes, data analytics can show all parties exactly what a change will mean. Weather, traffic, and other external data can be taken into consideration. Understanding the roads and conditions can be translated into optimally planning the many phases of a project. Identifying crowded roads or times, as well as when to expect poor weather conditions, means fewer surprises and hiccups. Over time, the applications of data become even greater. For commercial companies, this also means the ability to better evaluate subcontractors. Each step of the process, each contractor and location, can be evaluated.
Data analysis can be used to see what went better, or worse, on any given project. Proper data evaluation will reveal just how reliable a potential partner might be. As companies work with not just one but several other companies, the ability to analyze the risks associated with each potential partner becomes essential. Removing a less efficient partner from the mix could have far-reaching effects on any project and drastically cut wasted resources. Big data also offers another opportunity: simulation. Predicting outcomes is a major component of data analysis, and using the right data to simulate a project could yield unprecedented results. The many layers and locations of a construction project interact in countless ways, and small changes can cause problems in unexpected places. Thorough simulations could enable managers to pinpoint problem areas before a situation occurs. Recognizing the real-world limitations of a project might not always be possible, but data can at least lessen the likelihood of problems arising.
MOVING FORWARD WITH DATA
One reason big data has not yet been fully integrated into commercial construction is its complexity. Adding a team of dedicated data scientists to every project isn’t always possible, nor is it particularly cheap. The cost of big data systems and highly skilled analysts would likely undermine any chance of data becoming thoroughly integrated into the field in the near future. This is why more usable software, and software-as-a-service, are becoming more common. Outsourcing this work to an analytics company can completely shift the way a company views its construction process without breaking the bank. This is, however, only possible when companies are ready to fully take on a data-driven process. Picking and choosing applications, or only viewing data passively, could have little effect on the overall results, or even cause more difficulties. Companies must be ready to work around data and have clear goals in mind before setting out.
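The subcontractor-evaluation idea described above can be sketched as a simple scoring exercise over past-project records. The function name, record fields, and weightings below are illustrative assumptions for the sketch, not an industry-standard metric or anything described in the article.

```python
# Score hypothetical subcontractors from past-project records.
# Fields and weights are illustrative assumptions.

def reliability_score(records):
    """Blend on-time and on-budget rates into a single 0-1 score."""
    if not records:
        return 0.0
    on_time = sum(r["on_time"] for r in records) / len(records)
    on_budget = sum(r["on_budget"] for r in records) / len(records)
    # Weight schedule performance slightly above cost performance
    # (an arbitrary choice for this sketch).
    return 0.6 * on_time + 0.4 * on_budget

# Hypothetical history of two partners:
history = {
    "Acme Concrete": [
        {"on_time": True, "on_budget": True},
        {"on_time": True, "on_budget": False},
    ],
    "Best Framing": [
        {"on_time": False, "on_budget": True},
        {"on_time": True, "on_budget": True},
    ],
}

# Rank partners, most reliable first.
ranked = sorted(history, key=lambda n: reliability_score(history[n]), reverse=True)
for name in ranked:
    print(f"{name}: {reliability_score(history[name]):.2f}")
# → Acme Concrete: 0.80
#   Best Framing: 0.70
```

A real evaluation would draw on many more projects and factors (safety record, rework rates, communication), but the pattern of reducing a partner's history to a comparable score is the same.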
Big data is already being used by several companies, but it has not yet become a poster child for the industry. The introduction of industry-specific platforms is still underway and far from fully developed. As more firms turn to real-time, cloud-powered analytics, that’s likely to change. Source: http://dataconomy.com/big-data-is-transforming-commercial-construction/

BIG DATA'S BIG GUNS: ALLSTATE
Like some of its peers, Allstate has seen itself as a big data company for a while, according to chief data officer Floyd Yager. But the impetus to do more with the data it has collected over the years has never been greater. “Allstate has always had very good data, but it does exactly what it was designed to do: make us transactionally efficient on a product basis,” Yager says. “Now it’s about, ‘How do I take that operationally efficient data and turn it into a customer/household view and understand all the products attached to a person?’” Allstate has focused heavily on master data management and data governance during Yager’s tenure, creating party and household IDs for its data. The company is also building a team to work across business areas on analytics projects rather than siloing big data projects within certain units. “A lot of what we talk about is having our data more integrated so it can be shared across departments,” Yager says. “Something meant for a single purpose often leads to other insights. We know, for example, based on some call-volume analysis in our call center, how often customers defect.” Leading the data organization requires Yager to envision future opportunities and work with his counterparts in IT and beyond to find the right way to leverage data that in the past had never been accessible. “The sheer amount of data under management continues to increase every year,” he says. “We have an application in claims, QuickFoto, where a policyholder who isn’t in a major accident can snap a picture of the damage and send it to us. But whereas in the past, that would’ve gone into a physical folder and then a filing cabinet, now I have all those pictures of cars in a database, and there’s a lot more that I can do.” That’s really the Holy Grail for big insurers like Allstate, Yager says.
After all, they’ve always been collecting a large amount of data – but now, with the right strategy, more insights can be unlocked, making life easier for associates across the value chain. “We want it to be well-governed, high-quality, so we can make better insights about our customers,” he says. “We want to use it to better understand who that person is, understand their risk profile, and position our agents to be trusted advisors in helping them meet their insurance needs.” Source: Insurance Networking News, May 2015

BIG DATA'S BIG GUNS: SWISS RE
Big data is one of Swiss Re’s global strategic initiatives. The insurer is using more public data to improve underwriting results and decrease the number of questions it has to ask consumers in order to underwrite them. Riccardo Baron, big data and smart analytics lead for the Americas at Swiss Re, says currently available data opportunities were inconceivable only a few years back. “We are at a point where it is clear that the question is no longer ‘whether’ big data but rather ‘how’ big data should become a reality,” he says. In that context, Swiss Re is looking at big data in terms of two major streams. In the first, big data is being used to help reduce costs and improve the efficiency of current processes throughout the insurance value chain, including claims and fraud management, cyber risk, customer management, pricing, risk assessment and selection, distribution and service management, product innovation, and research and development. “My role is within our excellence task force, which works at a group level to analyze massive volumes of data, utilize the latest in analytics techniques and set the basis for developing innovative, new solutions for our clients,” Baron says. In the second stream, big data also offers a new framework to think bigger in terms of market disruption. “This has to do with innovation, and how we believe newcomers in the insurance industry could potentially disrupt traditional business models,” Baron explains. Baron says Swiss Re has created more than 100 prototypes internally, and that as a result the entire organization sees the value and importance of big data and smart analytics. “We are acquiring new data science skills and learning is becoming focused on individual market needs,” he says. Big data talent development and acquisition is also a key concern, Baron says. “We opted for this internally driven model largely because we aim at catalyzing a mindset change in our company.
We want Swiss Re to change with big data, and for our clients to benefit from new solutions in the future.” Source: Insurance Networking News, May 2015