The “big” in Big Data refers to three factors: (1) quantity of transactions, (2) speed of response required, and (3) complexity of analysis.
We have seen how Big Data provides a golden opportunity for Big Optimization.
- The asset management company is able to track, manage, and forecast every rail car in motion at every location for every railroad in North America. Quantity: many cars, many locations. Speed: constant updates as cars are in motion. Complexity: many different car types, facilities, geographies, and costs, all battling with Noisy Data.
- The airline runs one of the three largest air fleets in the world. It serves the largest number of airports of these three airlines, by a factor of 10. Quantity: many planes, crews, airports. Speed: need for rapid real-time responses to pilot call-ins as weather and other conditions change. Complexity: extremely high-cost vehicle assets (jets), combined with highly regulated people assets (unionized pilots), combined with varied facilities capacities and rules (airports).
- The high frequency hedge fund uses optimization to provide low-risk trading in a quintessential Big Data environment. Quantity: thousands of securities, thousands of transactions per second. Speed: real-time, sub-second solutions required. Complexity: multiple trading venues, all with different rules.
- For the e-commerce trailblazer, choosing appropriate box configurations for millions of customers creates a problem of enormous complexity, requiring advanced modeling and algorithmic techniques to achieve reasonable solution times.
- For the 2020 U.S. Census, on each day over the course of six weeks, tens of millions of open cases need to be efficiently assigned to enumerators so that the likelihood of a successful visit is maximized while travel time and mileage are minimized.
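The census example above is, at its core, an assignment problem: match cases to enumerators to maximize expected success net of travel cost. A minimal sketch of that structure follows, with a made-up score matrix and a brute-force search standing in for the large-scale algorithms the real system would require; all names and numbers here are illustrative assumptions, not the actual Census methodology.

```python
from itertools import permutations

# Toy census-style assignment: give each open case to one enumerator so
# that total success likelihood minus travel penalty is maximized.
# score[i][j]: illustrative net value of assigning case i to enumerator j.
score = [
    [0.9, 0.4, 0.2],
    [0.3, 0.8, 0.5],
    [0.6, 0.1, 0.7],
]

def best_assignment(score):
    """Brute-force search over all one-to-one assignments.

    Feasible only for tiny instances; at census scale (tens of millions
    of cases per day) specialized network/assignment algorithms are needed.
    """
    n = len(score)
    best_val, best_perm = float("-inf"), None
    for perm in permutations(range(n)):  # perm[i] = enumerator for case i
        val = sum(score[i][perm[i]] for i in range(n))
        if val > best_val:
            best_val, best_perm = val, perm
    return best_val, best_perm

val, perm = best_assignment(score)
print(perm, round(val, 2))  # -> (0, 1, 2) 2.4
```

The factorial blow-up of this brute-force search is exactly why problems of this quantity, speed, and complexity demand Big Optimization rather than enumeration.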