Saturday, July 25, 2009

Trident Warrior 2009

Here is a link to a great YouTube video about Trident Warrior 2009:
http://www.youtube.com/watch?v=z_CjJs-1PEc

Wednesday, July 22, 2009

Maritime Domain Awareness

LT Munson makes some good points in his Proceedings article, namely that reliance on identifying anomalous behavior should not be the critical element in developing actionable MDA. Countering the full range of threats, from traditional adversary militaries to non-traditional ones like terrorism, WMD proliferation, drug and weapons smuggling, low-intensity conflict, and piracy, requires an equally wide range of information. This is not a new problem, and no one nation has the resources required to go it alone. There are both policy and technical challenges: the former is about building trust, the latter about MDA data fusion and analysis.

Clearly, the creation of an interagency and international MDA network composed of analysis centers, operators, information repositories, and sensors that continuously monitor the globe for burgeoning maritime threats makes sense. The concept aligns well with the Navy's Strategy for Maritime Security and its call for expanded cooperative relationships with other nations, which putatively contribute to the rising tide of maritime security for the benefit of the global maritime commons. As LT Munson points out, the current emphasis is on ship positions, but that is largely a consequence of the technological advent of AIS and the astonishing success of the Maritime Safety and Security Information System (MSSIS), through which participating countries freely share unclassified, near real-time AIS data. A collaborative network of coalition partners that will eventually be able to monitor vessels, cargo, people, finance, and infrastructure requires establishing relationships and trust in addition to building a physical network. MSSIS and similar efforts are recognition that trust and cooperation cannot be surged; they must be built over time. Recognizing that international and interagency partners need access to, and the ability to collaboratively process, analyze, and disseminate, information on maritime threats has led to a blueprint for a net-centric information environment in which data from disparate sources and security domains will be discoverable, accessible, understandable, fused, and usable, with appropriate information assurance, to enable user-defined and common operational pictures. This blueprint is described in the Maritime Domain Awareness Architecture Management Hub Plan.

The technology challenge stems from the explosive growth in the generation and collection of data associated with vessels, cargo, and people, which will potentially require a new generation of techniques and tools to help transform those data into actionable intelligence: tools for preprocessing large volumes of data, data reduction, mining, and fusion. Data fusion is generally defined as the use of techniques that combine data from multiple sources in order to draw inferences that are more efficient, and potentially more accurate, than those that could be drawn from any single source. The well-recognized data fusion model developed by the Joint Directors of Laboratories Data Fusion Group provides a common frame of reference for fusion discussions. The point is that while anomaly detection is a component of Level 3, the objective of fusing the combined activity and capability of vessels, cargo, people, finance, and infrastructure is to infer intentions and assess threats.
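To make the fusion idea concrete, here is a minimal sketch in Python of an object-level (JDL Level 1) fusion step: correlating position reports for the same vessel from two unclassified feeds, say AIS and a coastal radar track, into a single weighted track point. The field names, source weights, and sample data are illustrative assumptions, not any program of record.

    # Minimal sketch of an object-level (JDL Level 1) fusion step: merge
    # position reports for the same vessel from two feeds into one fused
    # track point. Field names, sources, and weights are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class Report:
        vessel_id: str      # e.g., an MMSI from AIS or a correlated radar track ID
        lat: float
        lon: float
        source: str         # "AIS" or "radar" in this sketch
        confidence: float   # assumed 0..1 source-quality weight

    def fuse_reports(reports):
        """Group reports by vessel and return one confidence-weighted position per vessel."""
        by_vessel = {}
        for r in reports:
            by_vessel.setdefault(r.vessel_id, []).append(r)
        fused = {}
        for vessel_id, group in by_vessel.items():
            total = sum(r.confidence for r in group)
            fused[vessel_id] = {
                "lat": sum(r.lat * r.confidence for r in group) / total,
                "lon": sum(r.lon * r.confidence for r in group) / total,
                "sources": sorted({r.source for r in group}),
            }
        return fused

    if __name__ == "__main__":
        feed = [
            Report("366999001", 36.85, -76.29, "AIS", 0.9),
            Report("366999001", 36.86, -76.30, "radar", 0.6),
            Report("366999002", 36.95, -76.33, "AIS", 0.9),
        ]
        for vessel, track in fuse_reports(feed).items():
            print(vessel, track)

Situation and threat assessment (JDL Levels 2 and 3) would then reason over fused tracks like these together with cargo, crew, and financial data, which is where inferring intent, rather than anomaly detection alone, comes in.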

The vision for a global network-centric environment where users can access, analyze, and exchange a wide range of maritime information and collaborate on maritime problems with others is absolutely consistent with the argument made by Defense Secretary Robert Gates that "the most important military component in the war on terror is not the fighting we do ourselves, but how well we enable and empower our partners to defend and govern themselves."

Sunday, June 14, 2009

The Dumbest Generation

From the Economist, June 11:
A recent report from McKinsey, a management consultancy, argues that the lagging performance of the country’s school pupils, particularly its poor and minority children, has wreaked more devastation on the economy than the current recession. American children have it easier than most other children in the world, including the supposedly lazy Europeans. They have one of the shortest school years anywhere, a mere 180 days compared with an average of 195 for OECD countries and more than 200 for East Asian countries. German children spend 20 more days in school than American ones, and South Koreans over a month more. Over 12 years, a 15-day deficit means American children lose out on 180 days of school, equivalent to an entire year. American children also have one of the shortest school days, six-and-a-half hours, adding up to 32 hours a week. By contrast, the school week is 37 hours in Luxembourg, 44 in Belgium, 53 in Denmark and 60 in Sweden. On top of that, American children do only about an hour’s-worth of homework a day, a figure that stuns the Japanese and Chinese.
More at http://www.economist.com/world/unitedstates/displayStory.cfm?story_id=13825184

Saturday, March 28, 2009

Lead System Integrators

EXECUTIVE SUMMARY: In the wake of recent cost overruns, schedule slips, and performance shortfalls, there is a growing concern about using private-sector lead system integrators (LSIs) for the execution of large, complex, defense-related acquisition programs. Abandoning the LSI approach will challenge the Services, because in most cases they lack the ability to manage complex programs on their own.

DISCUSSION:
In recent years, DOD acquisition programs have turned to the LSI concept in large part because the Services have determined that they lack the in-house technical and project-management expertise needed to execute large, complex acquisition programs. Because private-sector firms often have better knowledge of, and expertise in, rapidly developing commercial technologies, LSI arrangements can promote greater technical innovation and overall system optimization.
LSI proponents argue that the only way to deliver a complex program with numerous component contracts is to contract for a single, seamlessly integrated program led by the LSI, which effectively streamlines the downselect and procurement process. In an LSI arrangement, however, the federal government has a contractual relationship with the LSI prime contractor, not with the subcontractors that report to it. That lack of transparency makes government management and oversight of an acquisition program more difficult and increases the risk of cost overruns, schedule slippage, poor product quality (e.g., lack of interoperability), and inadequate system performance.
Of course, cost overruns, schedule slips, and performance shortfalls have plagued large weapon system acquisition programs since World War II, so LSIs may be a scapegoat for more fundamental contributing factors, including requirements creep, funding instability, and immature technologies.
“We’ve relied too much on contractors to do the work of government as a result of tightening budgets, a dearth of contracting expertise in the federal government, and a loss of focus on critical governmental roles and responsibilities in the management and oversight of acquisition programs,” Coast Guard Commandant Adm. Thad Allen said last April. [1] In recent years, House Armed Services Committee members have been concerned about ceding too much program management to contractors, allowing cost overruns, schedule delays, and other problems to go undetected until it is too late. Rep. Gene Taylor (D-Miss.), chairman of the HASC Seapower and Expeditionary Forces subcommittee, said last month that he wants to abandon the use of a lead system integrator. [2]
At a Capitol Hill hearing on March 26 on the nomination of Ashton Carter to be the next Pentagon acquisition chief, SASC Chairman Sen. Carl Levin (D-Mich.) took DOD to task for not being able to report on its services outsourcing until 2011. “That’s a real problem,” Levin said. “We have contracted out so much of the services needed that we can’t even inventory the services for years.” [3]
All evidence suggests it could take up to a decade to rebuild the Navy’s roster of competent engineers and acquisition managers to a level capable of managing a program like LCS. Because DOD does not have the requisite work force, the likely outcome is some revamping of the LSI construct. [4]

Notes and Links:
[1] Bennett, John T., “U.S. Reasserts Control Over Contractors; Despite Deepwater Takeover, Many Say Gov’t Lacks Skills To Run Programs,” Defense News, April 23, 2007. http://integrator.hanscom.af.mil/2007/April/04262007/04262007-21.htm
[2] Kivlan, Terry, “Lawmaker Lays Down Markers on Fiscal 2010 Shipbuilding Budget,” CongressDaily, February 5, 2009. http://www.govexec.com/dailyfed/0209/020509cdpm1.htm
[3] Chavanne, Bettina H., “DoD Acquisition Work Force Insufficient,” Aviation Week, March 27, 2009. http://www.military.com/features/0,15240,187701,00.html?wh=news
[4] Grasso, Valerie B., “Defense Acquisition: Use of Lead System Integrators (LSIs) - Background, Oversight Issues, and Options for Congress,” Congressional Research Service (RS22631), January 10, 2009. http://digital.library.unt.edu/govdocs/crs/data/2008/meta-crs-10699.tkl

Friday, March 27, 2009

Task Force Mountain

Check out http://www.taskforcemountain.com/, which looks to be a forward-leaning approach to organizational communication - in the military.

Tuesday, March 10, 2009

Wolfram Alpha: 'A new paradigm for using computers and the web'

Another week, another Google killer. Last week, it was Twitter as Google killer. This week it’s Wolfram Alpha. The difference with Wolfram Alpha is that it has the pedigree, engineering heft, and perhaps a better mousetrap to actually live up to the billing.
Techmeme is aflutter with talk of Wolfram Alpha. Dan Farber notes that Stephen Wolfram is a scientist who has recorded a few breakthroughs and a little controversy. In a nutshell, Wolfram Alpha blends natural language, a new search model, and an algorithm that takes all the data on the Web and makes it “computable.” Wolfram recently outlined his latest creation and added:
I think it’s going to be pretty exciting. A new paradigm for using computers and the web.
Dan writes about Wolfram:
He received his Ph.D. in theoretical physics from Caltech in 1979 when he was 20 and has focused most of his career on probing complex systems. In 1988 he launched Mathematica, powerful computational software that has become the gold standard in its field. In 2002, Wolfram produced a 1,280-page tome, A New Kind of Science, based on a decade of exploration in cellular automata and complex systems.
In May, Wolfram will launch Wolfram Alpha, which is dubbed a computational knowledge engine. It’s pretty clear what Web giant Wolfram Alpha is targeting:

[Screenshot of the Wolfram Alpha home page.]
Look familiar?
For the brainiacs in the house, Nova Spivack has a long post outlining Wolfram Alpha (it’s a must-read). Simply put, if Spivack’s outline is only half on target, Wolfram Alpha could be big.
Spivack writes:
In a nutshell, Wolfram and his team have built what he calls a “computational knowledge engine” for the Web. OK, so what does that really mean? Basically it means that you can ask it factual questions and it computes answers for you.
It doesn’t simply return documents that (might) contain the answers, like Google does, and it isn’t just a giant database of knowledge, like the Wikipedia. It doesn’t simply parse natural language and then use that to retrieve documents, like Powerset, for example.
Instead, Wolfram Alpha actually computes the answers to a wide range of questions — like questions that have factual answers such as “What country is Timbuktu in?” or “How many protons are in a hydrogen atom?” or “What is the average rainfall in Seattle this month?,” “What is the 300th digit of Pi?,” “where is the ISS?” or “When was GOOG worth more than $300?”
Think about that for a minute. It computes the answers. Wolfram Alpha doesn’t simply contain huge amounts of manually entered pairs of questions and answers, nor does it search for answers in a database of facts. Instead, it understands and then computes answers to certain kinds of questions.
Spivack later mentions that Wolfram Alpha isn’t designed to be HAL 9000. That’s refreshing. Wolfram Alpha sounds impressive, but it would be premature to call it a Google killer. In fact, if Wolfram Alpha lives up to its billing, it will be acquired at some ridiculous price, either by Google or by some company (read: Microsoft) looking to kill Google.
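To make the retrieval-versus-computation distinction concrete, here is a toy sketch in Python. The documents, lookup tables, and question patterns in it are invented for illustration and say nothing about how Wolfram Alpha is actually implemented; the point is only the difference between returning documents that mention the words and computing an answer from curated, structured data.

    # Toy contrast: document retrieval vs. computing an answer from
    # structured data. Illustrative only; not Wolfram Alpha's design.
    DOCUMENTS = [
        "Timbuktu is an ancient city in Mali, near the Niger River.",
        "Hydrogen is the lightest chemical element and fuels the Sun.",
    ]

    COUNTRIES = {"timbuktu": "Mali"}   # curated place -> country facts
    ELEMENTS = {"hydrogen": {"protons": 1}, "helium": {"protons": 2}}

    def retrieve(query):
        """Search-engine style: return documents that merely mention the query terms."""
        terms = query.lower().split()
        return [doc for doc in DOCUMENTS if any(t in doc.lower() for t in terms)]

    def answer(query):
        """Knowledge-engine style: map a recognized question onto curated data."""
        q = query.lower()
        if q.startswith("what country is"):
            place = q.removeprefix("what country is").strip(" ?in")
            return COUNTRIES.get(place, "unknown")
        if q.startswith("how many protons are in"):
            rest = q.removeprefix("how many protons are in").strip(" ?")
            element = rest.removesuffix(" atom").removeprefix("a ")
            return ELEMENTS.get(element, {}).get("protons", "unknown")
        return "question type not recognized"

    if __name__ == "__main__":
        print(retrieve("Timbuktu country"))                        # documents that mention the terms
        print(answer("What country is Timbuktu in?"))              # -> Mali
        print(answer("How many protons are in a hydrogen atom?"))  # -> 1

The last two calls return computed answers from the tables rather than pages that happen to contain the words, which is the distinction Spivack is drawing.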