Data Rich and Information Poor (DRIP): The Adversary of Lethality

Earlier this year, The Strategy Bridge asked university and professional military education students to participate in our first annual writing contest by sending us their thoughts on strategy.

Now, we are pleased to present an essay selected for an Honorable Mention, submitted by Geoff Weber of the U.S. Navy’s Legislative Affairs Fellows Program.

The phrase data rich and information poor (DRIP) was first used in the 1983 best-selling business book In Search of Excellence to describe organizations rich in data but lacking the processes to produce meaningful information and create a competitive advantage. DRIP was defeated in the private sector with wise implementation of information technology. However, government institutions have lacked the incentives to attack the disease and have instead treated the symptoms of the dilemma, armed with a gross misunderstanding of the cognitive domain. In the U.S. Department of Defense, a deluge of data overwhelms analysts and provides little information that is timely and relevant to decision makers. At the speed of modern war, DRIP is a labyrinth of inaccessible and often un-authoritative stovepipes of data that retard information fusion, impede shared understanding, and encumber lethality. To defeat DRIP and build a more lethal force, the Secretary of Defense must clearly articulate this problem, prioritize strategic solutions, and seek the support of the U.S. Congress in authorizing and appropriating agile information-related capabilities overmatching those of any organization seeking to do harm to America.

Lethality necessitates that the proper weapon (what) be delivered to the precise location (where) at the correct time (when). The information requirements for success are relatively simple, but they become considerably more complex in overmatching competition and maximizing lethality. What is the most effective sequence of weapons against the enemy? How do these weapons deliver the desired effect? How will the enemy and others respond to this effect? Recognizing that knowledge of the enemy would overwhelmingly multiply lethality, Sun Tzu concluded, “When you are ignorant of the enemy but know yourself, your chances of winning or losing are equal,” but if you “know the enemy and know yourself; in a hundred battles you will never be in peril.”[1]


Questions to support lethality require answers at the speed of relevance if decision making left-of-boom is to be effective. General Joseph Dunford, the Chairman of the Joint Chiefs of Staff, has said, “The speed of war has changed, and the nature of these changes makes the global security environment even more unpredictable, dangerous and unforgiving. Decision space has collapsed and so our processes must adapt to keep pace with the speed of war.”[2] So, how do we ensure the Secretary of Defense gets knowledge at the speed of war to make decisions? All too often, answers are channeled up to decision makers through a sequential, slow, and often highly editorialized request-for-information process. In this process, one military unit needs information it cannot access but that other units might possess. It drafts a request for that information and sends it to a multitude of units. This arcane process further requires humans in the loop to laboriously recognize the request, review their information holdings, and respond, even when there is little incentive to do so. Often, positive responses take days because humans cannot operate at the speed and efficiency of digital systems.

Fortunately, information can be represented by digital bits that travel through fiber optic cables at the speed of light. Writing in Business @ the Speed of Thought, Bill Gates offered the world many examples of how good information systems can dramatically speed up production. He proposed a critical distinction between bit-oriented information processes on the one hand and atom-oriented efforts constrained by the physical environment on the other. According to Gates, the most successful organizations will reduce all information to highly structured bits of data, write algorithms to manipulate this data according to organizational objectives, and then seamlessly connect analysts and decision makers to this information to outpace their competition. Gates implored industry leaders to study all their information processes and integrate them into a digital nervous system, significantly reducing the time required for any bit-oriented efforts.[3]

Understanding the power of good information, many leaders in defense, industry, and the U.S. Congress have argued for increased government spending to support information fusion centers, big-data analytics, machine learning, artificial intelligence, and cloud computing. These are all noble efforts that partially contribute to information dominance, but are considerably less effective if a more fundamental problem for the Department of Defense—that it is data rich and information poor—is not defeated first. It seems proponents of these initiatives see the power of shiny solutions without fully understanding the problem of the Department continuing on a path of garbage in, garbage out. Appropriating additional funding to the proposed solution does not fix the problem when the underlying data is garbage, or the data is stored in thousands of garbage dumps as inaccessible stovepipes of information.


Building Shared Understanding (Joint Publication 3-0: Joint Operations)

To fully understand and correctly address the problem of being data rich and information poor, it is first necessary to recall what the cognitive domain has to say about data and information. As represented above, data must first be processed to create usable information that supports learning and ultimately decision making. Data is often represented by points that are seemingly meaningless until processed. Examples include locations, temperatures, or specific times. Processing these data points together by synthesizing matched attributes yields information about the temperature of a particular location at a specific time. Applying individual learning to this information would provide a basic level of knowledge that could be useful to a decision maker. Information becomes considerably more valuable when multiple records are fused together to build even broader knowledge. However, correctly moving up the cognitive domain from factual premises to inductive conclusions becomes a challenge when data cannot be accessed or points are not correctly structured.[4]

Data Challenges Associated with Information Fusion to Produce Knowledge (Author’s Work)

The figure above provides a simplified example to demonstrate the method an analyst employs to fuse information, and how the resulting knowledge product fails to be accurate when data is not structured appropriately or access to information is limited. In the example, temperature data from an information record in the year 2017 fuses with that from a 2013 record to produce a conclusion, or knowledge product, that weather on inauguration day is cold, but always above freezing. The underlying analysis was sound and included an astute understanding that inauguration day occurs on 20 January every four years, and if that date falls on a Sunday then inauguration day is 21 January. However, the resulting knowledge is flawed because the analyst used only one database that contained only two information records. Further, the analyst was unable to query records from a different database containing additional information, because its data wasn’t structured to a common standard or possibly wasn’t available on the same information system.

Indeed, weather on inauguration day in Washington, D.C., is not always above freezing, as the 1985 information record clearly illustrates. However, data from congressional records uses a different data structure for date and a different field-naming convention for location, which challenge an analyst to even access the required information. Further, even if the records were accessed, the data to be fused has a different field name (weather), type of data (text), and associated units (Fahrenheit). This presents a further challenge for a machine algorithm to rapidly overcome. In this simplified example, the obstacle is not too large for a human analyst to understand. However, in the complex and rapidly evolving environment where Department of Defense analysts and decision makers operate, the large proportion of partially usable data impedes the creation of accurate and meaningful information. In short, the Department of Defense is data rich and information poor—DRIP.
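The schema mismatch in the figure can be sketched in a few lines of code. The records, field names, date formats, and unit conversion below are illustrative assumptions mirroring the article’s example, not real database schemas; the point is that a single normalization step onto a common standard lets all three records fuse, and the flawed "always above freezing" conclusion disappears:

```python
from datetime import date

# Hypothetical records from two differently structured databases.
weather_db = [
    {"date": "2017-01-20", "location": "Washington, D.C.", "temp_c": 9},
    {"date": "2013-01-21", "location": "Washington, D.C.", "temp_c": 4},
]
congressional_db = [
    # Different field name ("weather"), data type (text), and units (Fahrenheit).
    {"day": "20 JAN 1985", "site": "WASHINGTON DC", "weather": "7 F"},
]

MONTHS = {"JAN": 1, "FEB": 2, "MAR": 3, "APR": 4, "MAY": 5, "JUN": 6,
          "JUL": 7, "AUG": 8, "SEP": 9, "OCT": 10, "NOV": 11, "DEC": 12}

def normalize_weather(rec):
    """Map the weather database's record onto a common standard."""
    y, m, d = map(int, rec["date"].split("-"))
    return {"date": date(y, m, d), "temp_c": float(rec["temp_c"])}

def normalize_congressional(rec):
    """Map the congressional record onto the same standard, converting units."""
    d, mon, y = rec["day"].split()
    temp_f = float(rec["weather"].rstrip(" F"))
    return {"date": date(int(y), MONTHS[mon], int(d)),
            "temp_c": round((temp_f - 32) * 5 / 9, 1)}

records = ([normalize_weather(r) for r in weather_db] +
           [normalize_congressional(r) for r in congressional_db])

# With all three records on one standard, fusion reaches a sounder conclusion.
coldest = min(r["temp_c"] for r in records)
print(coldest)  # prints -13.9 — the 1985 reading, well below freezing
```

The hard part in practice is not the arithmetic but knowing, for every source, which field is which and in what units—exactly the knowledge a common data standard would make unnecessary.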


In 2017, Lieutenant General Jack Shanahan, the Director for Defense Intelligence, reported that the Department of Defense collects 22 terabytes of data every day. “You cannot exploit 22 terabytes worth of data the way we are doing things today,” Shanahan said. But Eric Schmidt of Google, a member of the Defense Innovation Board (DIB), commented on the 22 terabyte figure and noted that “within the business world, this is not overwhelming. Those kinds of numbers are easily dealt with, with modern computing. So there is an example of a big gap between the commercial and defense worlds.”[5] The difference lies in how the data is structured and where it is stored.

Acknowledging the data rich and information poor situation faced by the U.S. government is not just the topic of recent deliberations. In 2001, Jack Sheehan of the Defense Modeling & Simulation Office released a report calling for the use of authoritative data sources. Sheehan clearly identified a time-consuming problem of “many users going to many data sources,” requiring many months and many people simply to find relevant information within the Department of Defense.[6] The intelligence community was no better and was chastised for the same problem in the 9/11 Commission Report. For nearly two decades, the Department of Defense and the intelligence community have lamented their lack of authoritative data sources and have made an art form of admiring the problem.

Knowledge of the enemy is critical to support lethality, but so too is knowledge of friendly force capabilities. Sun Tzu cautioned, “If ignorant both of your enemy and of yourself, you are certain in every battle to be in peril.”[7] Indeed, knowledge of the enemy is difficult to obtain, but the intelligence community is slowly defeating the phenomenon of being data rich and information poor while simultaneously protecting sources and methods. However, collecting and processing data on friendly forces is surprisingly no better. Consider how little the Department of Defense understands about its most precious resource, its people and the capabilities they bring to waging war. Because there is no central database to understand the foreign language capacity of the 1.4 million personnel currently serving on active duty, a Government Accountability Office report stated, “The Army and Marine Corps do not have the information they need to effectively leverage the language and culture knowledge and skills of these forces when making individual assignments and assessing future operational needs.”[8] The U.S. military did a better job deploying French and German speakers to Europe during World War II, without any digital information systems, than the Department of Defense does managing its people today with quality information systems.


The U.S. Department of Defense has not institutionalized information dominance the way the private sector has. Worse, adversaries understand U.S. bureaucracy perpetuates its data rich and information poor condition, and have streamlined their own organizations to be agile dominators of information. They further recognize properly designed information systems can provide a speedy decision-making advantage over the U.S. and have therefore focused their efforts on speed. Lieutenant General S.A. Bogdanov, a Russian Doctor of Military Sciences, observes, for example:

“A unified information and communication system is to be deployed to link decision makers and the people carrying out their decisions in order to quickly deliver essential information about the situation fast to all participants in military operations on the ‘many-to-many’ principle.”[9]

Defense experts often cite U.S. military forces as the friendly center of gravity in many combat scenarios. These forces frequently cite intelligence, one of the six operational functions, as a critical capability that affords Clausewitz’s “source of power that provides moral or physical strength, freedom of action, or will to act.”[10] Information systems are a critical requirement for good intelligence and must provide information to decision makers that is both timely and relevant. However, data rich and information poor practices greatly impede the completeness, speed, and accuracy of this information. This “aspect of a critical requirement which is deficient” makes information systems a Department of Defense critical vulnerability that could lead to catastrophic effects.[11]

This critical vulnerability is an unintended result of little unity of effort in the information domain. With many military or intelligence units doing their own thing, in a manner of speaking, there are relatively few data standards to support information dominance. This represents a sharp departure from the earliest days of employing information systems in the Department of Defense and the creation of the Internet by the Defense Advanced Research Projects Agency (DARPA). DARPA created the first data standards, such as the Transmission Control Protocol, to accurately communicate information between nodes in a network. The remainder of the digital world has adopted these standards and has continued systematically developing new information-related capabilities that far surpass the ability of the Department of Defense to process its own data.


LtGen Michael Peterson (USAF Photo)

The majority of data in the Department of Defense resides in a fragmented assortment of unstructured and un-authoritative portals making it improbable for analysts to rapidly fuse into timely knowledge of the operational environment. Lieutenant General Michael Peterson, the Air Force Chief of Warfighting Integration and Chief Information Officer, lamented this precarious situation in 2008: “Finding the authoritative data becomes time-consuming and difficult for intelligence analysts because the data are stored in multiple locations.” He further stated that “two-thirds of the time required to prosecute a time-sensitive target is allocated to manual communication processes—not machine to machine, not automated, but rather someone making a voice call, writing something down, or manually entering data.”[12]

Speaking in March 2017 as the head of the Strategic Capabilities Office (SCO), William Roper said the Department of Defense focuses “on data in a 1990s-era way—data for us is like something that you use to go into the fight and win, and after that fight, the purpose of the data, its raison d’etre, is over.” Roper continued, “That is not the way in the commercial world—trying to work analytics—to them, that data is truly gold.” Roper concluded, “It’s a commodity, it’s a wealth, it’s also a fuel, and your data keeps working for you even after you’ve used it.”[13] Even old data is of value to understanding the past, and the Department of Defense is doomed to repeat bad practices if it cannot defeat the disease of being data rich and information poor.

Now, the challenge within the Department is to process a multiplicity of unstructured data from thousands of Air Force, Army, Marine Corps, and Navy (and other) units across six geographic and four functional Combatant Commands supported by a variety of defense agencies. Lacking a common data standard, and with very few recognized authoritative data sources, it is cumbersome to produce algorithms to translate and process each form of data into meaningful information, let alone analyze these disparate information forms to create a timely knowledge product of relevance to the decision maker.
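A mandated common standard inverts that burden: producers conform once, and consumers need a single parser instead of one translation algorithm per source. A minimal sketch of what enforcement might look like follows; the schema, field names, and rules are purely illustrative assumptions, not an actual Department standard:

```python
# One declared schema that every data producer must satisfy.
# Field names, types, and units here are illustrative assumptions.
REQUIRED_FIELDS = {
    "timestamp": str,    # e.g., ISO 8601: "2017-01-20T12:00:00Z"
    "location":  str,    # standardized place name
    "temp_c":    float,  # always numeric, always Celsius
}

def validate(record):
    """Return a list of conformance errors; an empty list means the
    record meets the common standard and needs no bespoke translation."""
    errors = []
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors

good = {"timestamp": "2017-01-20T12:00:00Z",
        "location": "Washington, D.C.", "temp_c": 9.0}
bad = {"day": "20 JAN 1985", "weather": "7 F"}  # legacy stovepipe format

print(validate(good))  # [] — conforms to the standard
print(validate(bad))   # lists every missing standard field
```

Rejecting non-conforming records at the point of entry is the "stop the bleeding" step: it keeps new data from landing in yet another un-authoritative stovepipe that some future analyst must reverse-engineer.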

President John F. Kennedy: "We choose to go to the Moon! We choose to go to the Moon in this decade and do the other things, not because they are easy, but because they are hard..." (NASA/Wikimedia)

Defeating DRIP will require a moon-shot approach to information dominance, much like that taken in President John F. Kennedy’s 1961 speech providing the vision to put an American on the Moon in 1969. There has been plenty of talk regarding information dominance in the Department of Defense, but outside of the increasing metric of dollars spent, the talk does not offer a solution that addresses the root cause of the problem. Like a tenet of combat casualty care, it would be wise to stop the bleeding, as it were, and prevent data from going to multiple un-authoritative databases. Instead, real information dominance requires a mandated authoritative data standard so all participants are looking at information derived from the same data.

The Department of Defense began collecting digital data long before it had repositories where the data could be most useful. Fortunately, it is not yet too late to start fixing the problem. The Defense Innovation Unit Experimental (now the Defense Innovation Unit), the Defense Digital Service, and the Strategic Capabilities Office, all members of the Department’s Cloud Executive Steering Group (CESG), are a part of this solution. The Steering Group was formed in 2017 with the intent of making commercial cloud services available to the Department of Defense. One initiative, the Joint Enterprise Defense Infrastructure (JEDI), would bring a variety of Departmental data onto a single platform to best serve the warfighter, while also facilitating machine learning and artificial intelligence. Despite the Joint Enterprise Defense Infrastructure being validated by the Joint Requirements Oversight Council, it still faces opposition from Congress and some in industry who seek to perpetuate the problem, and their government contracts, by obfuscating any simple solution to the state of being data rich and information poor.


In support of the 2018 National Defense Strategy and its three lines of effort, quoted below, I make the following recommendations to Secretary James Mattis. First, clearly define the problem of being data rich and information poor and how it impedes efforts to “build a more lethal force.” Tell a powerful story in a way that all users of information systems can understand, whether a frontline warfighter or an influential member of Congress. Second, take control of data across the Department of Defense by releasing an Information Dominance Strategy and Directive that control information systems acquisitions according to these objectives. This step would greatly “reform the Department for greater performance and affordability.” Third, define authoritative data sources and bring them to a Department-wide cloud in an agile method focused on warfighter needs. By providing data on a secure cloud, the Department of Defense would be well positioned to “strengthen alliances and attract new partners” within other agencies of the U.S. government, in addition to members of the international community that share U.S. security values.

Those who idly repeat that the nature of warfare is changing may have already missed their opportunity to institute these reforms in the Department of Defense. Problems like being data rich and information poor were identified decades ago and will require bureaucracy busters who act with a sense of urgency to support this mission requirement. I salute those who accept this challenge and long for the day DRIP is completely eradicated from the Department of Defense, so that no adversary would ever think of confronting the information-driven capability of the U.S. military.

Geoff Weber spent eleven years in the private sector before, motivated by the events of 9/11, he earned a commission in the U.S. Navy. He has since worked in a variety of roles, all using information dominance to support decision-making superiority. He is currently on a Legislative Affairs Fellowship at the U.S. Senate in Washington, DC. The contents of this article reflect his personal views and are not necessarily endorsed by the Department of the Navy, the Department of Defense, or the U.S. Government.

This article appeared originally at Strategy Bridge.


[1] Sun Tzu, The Illustrated Art of War, trans. Samuel B. Griffith (Oxford, UK: Oxford University Press, 2005), 125.

[2] Jim Garamone, “Dunford: Speed of Military Decision-Making Must Exceed Speed of War,” Joint Force Quarterly, 1st Quarter 2017.

[3] Bill Gates, Business @ the Speed of Thought: Using a Digital Nervous System (New York: Warner, 1999), 154-155.

[4] U.S. Office of the Chairman of the Joint Chiefs of Staff, Joint Publication 3-0: Joint Operations, January 17, 2017, III-15.

[5] Aaron Mehta. “Pentagon Tech Advisors Target How the Military Digests Data.” Defense News, April 6, 2017.

[6] Jack Sheehan. Data Provisioning Using Authoritative Data Sources. Proceedings from the 3rd Simulation Based Acquisition Conference, May 15-17, 2001. Arlington, VA: National Defense Industrial Association, 2001.

[7] Sun Tzu, Art of War, 125.

[8] U.S. Government Accountability Office, Language and Culture Training: Opportunities Exist to Improve Visibility and Sustainment of Knowledge and Skills in Army and Marine Corps General Purpose Forces—Report to Congressional Committees, GAO-12-50, Washington, D.C., October 2011.

[9] V.N. Gorbunov and S.A. Bogdanov, “Armed Confrontation in the 21st Century,” Military Thought, Vol. 1, 2009, 19.

[10] Carl von Clausewitz, Michael Howard and Peter Paret. On War. Princeton: Princeton University Press, 1976.

[11] Ibid.

[12] Michael Peterson, “Data Transparency: Empowering Decisionmakers,” Joint Force Quarterly, Issue 49, 2nd Quarter 2008.

[13] Aaron Mehta. “Pentagon Tech Advisors Target How the Military Digests Data.” Defense News, April 6, 2017.
