Hack/reduce lends a hand in Big Data security

Nonprofit said Monday it had hired as executive director Abby Fichtner, an ex-Microsoft evangelist

Hack/reduce, a Boston-based nonprofit organization that acts as a springboard for Big Data projects, will host efforts to develop a security layer that would sit on top of Hadoop, the open-source framework for distributed data storage and processing.

Startup Sqrrl will move into the hack/reduce office to lead the team working on the Accumulo project, Frederic Lalonde, co-founder of the nonprofit and founder of the Hopper travel site, said on Monday. Work is expected to begin soon.

Accumulo is an open-source project of The Apache Software Foundation. The data scientists who started the Apache Accumulo project also founded Sqrrl.

Sqrrl is building a secure, scalable, and efficient discovery layer on top of the Accumulo platform, which was originally developed by the National Security Agency (NSA).

Accumulo provides access control at the level of individual table cells, an important capability for organizations that need strict data control, such as those in health care and law enforcement.
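To illustrate the idea, here is a minimal sketch of cell-level visibility in Python. It is not Accumulo's actual API (Accumulo supports full boolean visibility expressions such as `admin&(hr|audit)`); this simplified model treats each cell's visibility as a set of labels, all of which a reader must hold to see the cell.

```python
# Simplified sketch of cell-level security, inspired by Accumulo's model.
# Each cell carries a set of visibility labels; a scan returns only the
# cells whose labels are all covered by the reader's authorizations.

from dataclasses import dataclass


@dataclass(frozen=True)
class Cell:
    row: str
    column: str
    value: str
    visibility: frozenset = frozenset()  # labels required to read this cell


def scan(cells, authorizations):
    """Return only the cells the reader is authorized to see."""
    auths = frozenset(authorizations)
    return [c for c in cells if c.visibility <= auths]


# Hypothetical health-care example: the diagnosis is restricted to
# clinicians, and the billing code requires both labels.
table = [
    Cell("patient-17", "name", "J. Doe"),
    Cell("patient-17", "diagnosis", "restricted", frozenset({"clinician"})),
    Cell("patient-17", "billing_code", "restricted",
         frozenset({"clinician", "billing"})),
]

print([c.column for c in scan(table, {"clinician"})])
# -> ['name', 'diagnosis']
```

A reader with no authorizations sees only the unrestricted `name` cell, while the billing code stays hidden even from a clinician who lacks the `billing` label, mirroring how per-cell labels allow differently privileged users to query the same table.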

"This is the first Big Data database that's been built with security as the primary concern," Lalonde said of Sqrrl's work.

[See also: Database security - At rest, but not at risk]

Hack/reduce is scheduled to launch officially next month. On Monday, the nonprofit announced that it had hired as executive director Abby Fichtner, an ex-Microsoft evangelist who was in charge of building relationships with startups.

"She was hired to run this organization, which is a full-blown nonprofit," Lalonde said. "This is not an incubator. We don't take a percentage of companies. You don't have to have a startup to come here."

Hack/reduce has raised about $3 million from venture capital firms and tech companies, such as IBM, Dell and EMC. Co-founder Chris Lynch of Atlas Venture has led the fundraising. The third founder is Steve Papa, chief executive of Endeca Technologies. Papa started Endeca in 1999 and sold it to Oracle last year.

Big Data is a loosely defined term that is used to describe the huge, complex data sets found in meteorology, genomics, biological and environmental research and many other disciplines. Because the amount of data goes beyond the capacity of traditional database management tools, researchers and startups are working on new software for slicing and dicing the data for more efficient and accurate analysis.

Part of hack/reduce's effort is to provide data sets of sufficient size for researchers to experiment on, as well as the necessary computing power. "Hack/reduce is also about creating a place where people who are passionate about the data can meet programmers interested in finding a new project to work on," Lalonde said.

Before setting up house in Boston, hack/reduce had operated since 2011 as a roving series of Hadoop hackathons organized by Hopper. The new organization will be run by a board of directors with an advisory board set up to help vet projects.
