Post Process

Everything to do with E-discovery & ESI

Archive for the ‘Trends’ Category

Preview of MS Office 2010

Posted by rjbiii on July 13, 2009

Microsoft has announced details of upgrades to its Office suite, and PC Pro posts a preview here. Among other things, the article discusses changes to Outlook:

As far as the desktop applications are concerned, the Ribbon interface first introduced with Office 2007 has now been rolled out across every application, including Outlook.

Outlook also sees the introduction of two new email features for office workers drowning under a deluge of email. The Conversation Clean-Up tool will condense long email chains into summaries of the conversation, allowing you to catch up with all the key information without having to open dozens of different messages individually.

Outlook will also have a new Ignore Conversation feature that allows users to opt-out of round-robin emails that don’t concern them. Adams gives the example of a long email discussion about a dinner engagement that you know you won’t be able to attend. One click of the Ignore Conversation button will junk any further emails on that topic.
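Neither feature has been documented in detail yet, but the Conversation Clean-Up idea can be caricatured as grouping messages by thread and dropping any message whose text is quoted in full by a later reply. The sketch below is purely illustrative; the function and field names are hypothetical and do not reflect Outlook's implementation:

```python
from collections import defaultdict

def clean_up_conversation(messages):
    """Toy sketch of a 'Conversation Clean-Up' pass: group messages by
    normalized subject, then drop any message whose body is already
    quoted verbatim inside a later message in the same thread."""
    threads = defaultdict(list)
    for msg in messages:
        # Strip reply/forward prefixes to find the thread key.
        subject = msg["subject"]
        while subject.lower().startswith(("re: ", "fw: ", "fwd: ")):
            subject = subject.split(": ", 1)[1]
        threads[subject].append(msg)

    condensed = {}
    for subject, msgs in threads.items():
        keep = []
        for i, msg in enumerate(msgs):
            later_bodies = [m["body"] for m in msgs[i + 1:]]
            # Keep the message only if no later message quotes it in full.
            if not any(msg["body"] in body for body in later_bodies):
                keep.append(msg)
        condensed[subject] = keep
    return condensed
```

The payoff is the one described in the article: a reviewer reads only the messages that add new content, instead of opening dozens of near-duplicates.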

Other changes include an entire line of web-based apps, a la Google; better image editing within Word and PowerPoint; and fewer licensing categories (reducing the number of ‘versions’ of the suite from its current eight to five).

H/T: Slashdot


Posted in Articles, MS Office, Technology, Trends | Leave a Comment »

Cell Phone Data becoming a Factor in the Court Room

Posted by rjbiii on July 8, 2009

From the New York Times:

The pivotal role that cellphone records played in two prominent New York murder trials this year highlights the surge in law enforcement’s use of increasingly sophisticated cellular tracking techniques to keep tabs on suspects before they are arrested and build criminal cases against them by mapping their past movements.

But cellphone tracking is raising concerns about civil liberties in a debate that pits public safety against privacy rights. Existing laws do not provide clear or uniform guidelines: Federal wiretap laws, outpaced by technological advances, do not explicitly cover the use of cellphone data to pinpoint a person’s location, and local court rulings vary widely across the country.

H/T: Slashdot

Posted in Articles, Cell Phones, Data Sources, Technology, Trends | Leave a Comment »

On the discoverability of voicemail

Posted by rjbiii on May 25, 2009

Mark Sidoti and Paul Asfendis, writing for Law.com, have recently posted an article discussing the different types of voicemail, and the discoverability of each:

Companies today have more options than ever for generating, receiving, storing, retrieving and disposing of voicemail messages.

In the past, voicemails were stored on analog tapes, but increasingly, organizations now use unified, digital systems that integrate telephone and computer systems. While more efficient and flexible, these advances raise a number of electronic data discovery issues.

If your organization is considering an upgrade, it’s imperative to evaluate the effect, if any, that the new system will have on your obligation to preserve, search and disclose relevant voicemail messages.

The authors discuss the differences between analog and digital systems. They also compile a list of “challenges” for the organization deciding to implement a “unified” v-mail system (that is, a digital system that is integrated with the IT Enterprise). Well worth the read.

Posted in Articles, Technology, Trends, Voice Mail Systems | Leave a Comment »

Comparing Discovery in Canada and the US

Posted by rjbiii on April 26, 2009

Byte and Switch has a nice post discussing the differences between Canadian and US discovery processes. From the blog:

I expected litigation and e-discovery to be closer than it really is. Here is our overview of the situation: The Canadian provinces exert tremendous control over e-discovery practices and procedures in common and civil law. There is no corresponding national statute such as the U.S. Federal Rules of Civil Procedure, making e-discovery in Canada difficult to effect on a unified national principle.

There are national Canadian guidelines with the publication of Sedona Canada’s e-discovery principles and the Judicial Council’s practice direction for e-discovery in civil courts. (“The Sedona Canada Principles Addressing Electronic Discovery” and “National Model Practice Direction for the Use of Technology in Civil Litigation,” respectively.) These principles and guidelines for court practice are excellent steps forward and provide guidance for provinces that are developing their own sets of e-discovery rules. (British Columbia, Nova Scotia, Alberta, and Ontario have well-developed principles or drafts, and other provinces are no doubt busy as well.) Neither is statutory, and they exist as guidelines to implementation.

The post states that U.S. attorneys can learn from the Canadians’ attempts at reining in costs, while Canadians could benefit from using proper tools.

Posted in Articles, International Issues, Trends | Tagged: | Leave a Comment »

FTC Unveils New ‘Red Flags’ Website

Posted by rjbiii on April 8, 2009

The Red Flags rule, designed to tighten data security and fight ID theft, comes into force on May 1. The FTC has launched a website designed to help businesses determine whether they need to comply, and how to do so.

According to the agency’s “How-to” guide (click here for a pdf version), the Red Flags rule mandates:

  • The establishment of a program that includes reasonable policies and procedures to identify the “red flags” of identity theft a business may run across during its day-to-day operations.
  • The program must be designed to detect the specific red flags that have been identified.
  • The program must spell out the appropriate actions to be taken when red flags are detected.
  • A process to periodically re-evaluate the program’s policies and procedures.
  • The implementation of those policies into business practices.

Institutions that must comply with the new rule include:

  • Financial Institutions; and
  • Creditors (entities that regularly grant or arrange loans, extend credit to consumers or businesses, or make “credit decisions”).

The rules were initially slated to become effective on November 1, 2008, but the FTC granted businesses a six-month delay. That reprieve is now ending, however.

Posted in Articles, Compliance, Information Governance, Red Flags Rule, Trends | Leave a Comment »

Doubts about Self-Regulation

Posted by rjbiii on April 7, 2009

InfoWorld has posted an article casting doubt on the wisdom of using self-regulation to ensure compliance. The article highlights a story in which Macy’s refused to provide contact information for customers who bought toy necklaces later found to contain lead. From the article:

Macy’s was one of the retailers that pulled the necklaces. But when L.A. Deputy District Attorney Daniel Wright asked for the records of customers who bought the necklaces, Macy’s refused to turn over any information. At issue is the ability to notify parents who purchased the necklaces for their children.

The article speculates that the reason for Macy’s refusal may be that the retailer is not in compliance with Payment Card Industry standards. That aside, the bottom line is that self-regulation is being given a black eye.

A study released in December 2008 pointed out issues with the EU-Dept. of Commerce Safe Harbor scheme. The study claimed that only 22% of the companies that had “self-certified” as compliant with Safe Harbor principles actually were. Its basic conclusion was that the program had been ineffective.

The operational rationale behind self-regulation is undermined by figures such as those reported above. Information technology best practices rest, in substantial part, on the principles embodied in active self-regulation. Recent events, from the collapse of the financial sector to the misdeeds behind the mortgage crisis, illustrate the limits of self-regulation and recall to our consciousness the maxim: trust, but verify.

Posted in Articles, Compliance, Self-Regulation, Trends | Tagged: , | Leave a Comment »

Automating Science

Posted by rjbiii on April 5, 2009

Post Process has, in the past, posted small articles on the changes in technology that are transforming society. We pointed to the new field of computational journalism. We also put up a post discussing the “age of the petabyte,” in which we discussed the consequences of having so much data available. In that post, we highlighted a defense attorney’s use of Google analytics to more objectively examine a community’s definition of “obscenity.”

Now, from Wired, comes word of a computer (or robot, if you like) that not only stores and analyzes facts, but uses the wealth of data now available to discover laws of physics on its own. That is, the discovery is made by the computer rather than by a human being:

“It’s a powerful approach,” said University of Michigan computer scientist Martha Pollack, with “the potential to apply to any type of dynamical system.” As possible fields of application, Pollack named environmental systems, weather patterns, population genetics, cosmology and oceanography. “Just about any natural science has the type of structure that would be amenable,” she said.

Compared to laws likely to govern the brain or genome, the laws of motion discovered by the program are extremely simple. But the principles of Lipson and Schmidt’s program should work at higher scales.

The researchers have already applied the program to recordings of individuals’ physiological states and their levels of metabolites, the cellular proteins that collectively run our bodies but remain, molecule by molecule, largely uncharacterized — a perfect example of data lacking a theory.

Their results are still unpublished, but “we’ve found some interesting laws already, some laws that are not known,” said Lipson. “What we’re working on now is the next step — ways in which we can try to explain these equations, correlate them with existing knowledge, try to break these things down into components for which we have clues.”
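According to the Wired article, Lipson and Schmidt's program evolves candidate equations with genetic programming. As a much simpler toy illustration of the underlying idea (propose candidate symbolic laws, score each against observed data, keep the best fit), consider the following sketch. The hard-coded candidate list and free-fall data are purely illustrative:

```python
# Toy sketch of law discovery by search: propose candidate equations,
# score each against observed data, and keep the best fit. The real
# system (per the article) evolves candidate expressions with genetic
# programming; this fixed candidate list is only for illustration.

def discover_law(observations, candidates):
    """Return the (name, function) candidate with the smallest squared error."""
    def error(law):
        return sum((law(t) - y) ** 2 for t, y in observations)
    return min(candidates, key=lambda c: error(c[1]))

# Simulated free-fall data: distance fallen after t seconds (g = 9.8 m/s^2).
data = [(t, 0.5 * 9.8 * t ** 2) for t in (0.0, 0.5, 1.0, 1.5, 2.0)]

candidates = [
    ("d = 9.8 * t",          lambda t: 9.8 * t),
    ("d = 0.5 * 9.8 * t**2", lambda t: 0.5 * 9.8 * t ** 2),
    ("d = 9.8 * t**3",       lambda t: 9.8 * t ** 3),
]

best_name, _ = discover_law(data, candidates)
```

The hard part, as Lipson notes in the quote above, is not picking the best-fitting equation but explaining it and correlating it with existing knowledge.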

One wonders whether the “automation” of scientific discovery will bring such a rapid pace of discovery that we will all be left behind, hopelessly swimming against an ever-stronger tide.

Posted in Articles, Technology, Trends | Leave a Comment »

The Search Engine as Electronic Brain: Wolfram Alpha goes Live in May

Posted by rjbiii on March 11, 2009

CNet blogger Dan Farber discusses the upcoming release of Stephen Wolfram’s latest venture: a new search engine that is being touted as a breakthrough:
[Entrepreneur Nova] Spivack gave some insight into how Wolfram’s search engine works:

Wolfram Alpha is a system for computing the answers to questions. To accomplish this it uses built-in models of fields of knowledge, complete with data and algorithms, that represent real-world knowledge.

For example, it contains formal models of much of what we know about science — massive amounts of data about various physical laws and properties, as well as data about the physical world.

Based on this you can ask it scientific questions and it can compute the answers for you. Even if it has not been programmed explicitly to answer each question you might ask it.

But science is just one of the domains it knows about–it also knows about technology, geography, weather, cooking, business, travel, people, music, and more.

It also has a natural language interface for asking it questions. This interface allows you to ask questions in plain language, or even in various forms of abbreviated notation, and then provides detailed answers.

The vision seems to be to create a system which can do for formal knowledge (all the formally definable systems, heuristics, algorithms, rules, methods, theorems, and facts in the world) what search engines have done for informal knowledge (all the text and documents in various forms of media).
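Spivack's distinction, computing answers from built-in domain models rather than retrieving documents, can be caricatured with a tiny "computable knowledge" table. Everything below (the data, the question patterns, the function names) is illustrative only and reflects nothing of Wolfram Alpha's internals:

```python
# Toy caricature of 'computing the answer to a question': a small
# built-in model (stored data plus formulas) instead of a document
# index. The data and the question matching are illustrative only.

PI = 3.141592653589793
PLANET_RADIUS_KM = {"earth": 6371, "mars": 3390}

def answer(question):
    q = question.lower()
    for planet, radius in PLANET_RADIUS_KM.items():
        if planet in q and "circumference" in q:
            # The answer is computed from stored data on demand,
            # not looked up as a pre-written document.
            return round(2 * PI * radius)
        if planet in q and "radius" in q:
            return radius
    return None

circumference = answer("What is the circumference of Mars?")
```

The contrast with a conventional search engine is that no document containing the answer needs to exist; the system derives it from its model, which is the point Spivack makes about questions the system "has not been programmed explicitly to answer."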

As the article mentions, Wolfram is the creator of Mathematica, and the writer of a book (not always warmly received) entitled A New Kind of Science.

Posted in Articles, Search Engine Technology, Technology, Trends | Tagged: , | Leave a Comment »

Microsoft introduces Gazelle: the Web Browser as O/S

Posted by rjbiii on February 22, 2009

Microsoft has released a paper introducing Gazelle (Abstract here; Complete PDF Paper here):

Web browsers have evolved to be a multi-principal operating environment where a principal is a web site. Similarly to a multi-principal OS, recent proposals and browsers like IE 8 and Firefox 3 advocate and support abstractions for cross-principal communication (e.g., PostMessage) and protection (for frames) to web programmers. Nevertheless, no existing browsers, including new architectures like IE 8, Google Chrome, and OP, have a multi-principal OS construction that gives a browser-based OS, typically called Browser Kernel, the exclusive control to manage the protection and fair-sharing of all system resources among browser principals.

In this paper, we present a multi-principal OS construction of a secure web browser, called Gazelle. Gazelle’s Browser Kernel exclusively provides cross-principal protection and fair sharing of all system resources.
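The paper's central idea, a kernel that exclusively mediates each principal's access to system resources, can be sketched in a few lines. This is an illustrative toy, not Gazelle's code or API; in the paper a principal is identified by its web site origin:

```python
# Toy sketch of the browser-kernel idea from the Gazelle paper: each
# web principal (identified here by its origin string) can touch
# system resources only through the kernel, which keeps principals'
# resources isolated from one another by construction.

class BrowserKernel:
    def __init__(self):
        self._storage = {}  # per-principal private storage

    def write(self, principal, key, value):
        self._storage.setdefault(principal, {})[key] = value

    def read(self, principal, key):
        # A principal may read only its own data; cross-principal
        # access simply cannot be expressed through this interface.
        return self._storage.get(principal, {}).get(key)

kernel = BrowserKernel()
kernel.write("https://a.example", "token", "secret-a")
kernel.write("https://b.example", "token", "secret-b")
```

The design point mirrors an OS kernel: protection lives in one trusted component rather than being re-implemented (and re-broken) by each page or plugin.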

The paper limits its discussion to Gazelle’s unique resource protection architecture.

H/T: Slashdot

Posted in Articles, Browsers, Technology, Trends | Tagged: , | Leave a Comment »

Computing is becoming a foundation for many professions

Posted by rjbiii on January 6, 2009

Post Process has blogged in the past (many times) about the importance of understanding computing, logic structures, storage, and associated topics for e-discovery professionals. Law is becoming digitized, even if it is against the will of a significant portion of those in practice. The challenges of e-discovery, however, are merely the symptom of something bigger. Society is being transformed on multiple fronts. The area of my concern is, of course, information technology. The spread of computers and the advent of (nearly) global connectivity are creating a revolution that not only offers intriguing promise, but also very difficult challenges. Transformational technologies are also disruptive, by their nature. To get a glimpse into the changes, take a look at an article posted by John Mecklin of Miller-McCune, discussing computing’s encroachment into the world of investigative journalism, and the new field of computational journalism.

Now, though, the digital revolution that has been undermining in-depth reportage may be ready to give something back, through a new academic and professional discipline known in some quarters as “computational journalism.” James Hamilton is director of the DeWitt Wallace Center for Media and Democracy at Duke University and one of the leaders in the emergent field; just now, he’s in the process of filling an endowed chair with a professor who will develop sophisticated computing tools that enhance the capabilities — and, perhaps more important in this economic climate, the efficiency — of journalists and other citizens who are trying to hold public officials and institutions accountable.
[…]
Bill Allison, a senior fellow at the Sunlight Foundation and a veteran investigative reporter and editor, summarizes the nonprofit’s aim as “one-click” government transparency, to be achieved by funding online technology that does some of what investigative reporters always have done: gather records and cross-check them against one another, in hopes of finding signs or patterns of problems. Allison has had a distinguished career, from his work as an investigative reporter at The Philadelphia Inquirer to his investigative duties at the Center for Public Integrity, where he co-authored The Cheating of America with legendary center founder Charles Lewis. Before he came to the Sunlight Foundation, Allison says, the notion that computer algorithms could do a significant part of what investigative reporters have always done seemed “far-fetched.”

But there’s nothing far-fetched about the use of data-mining techniques in the pursuit of patterns. Law firms already use data “chewers” to parse the thousands of pages of information they get in the discovery phase of legal actions, Allison notes, looking for key phrases and terms and sorting the probative wheat from the chaff and, in the process, “learning” to be smarter in their further searches.
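The keyword-and-pattern pass the quote describes can be sketched in a few lines. A real review platform does far more (stemming, clustering, relevance learning); this toy stand-in, with hypothetical names throughout, shows only the core filtering step:

```python
import re

def keyword_filter(documents, terms):
    """Rank documents by how many hits of the given key terms each
    contains. A toy stand-in for the 'data chewers' the article
    mentions; real e-discovery tools add stemming, clustering, and
    relevance learning on top of this kind of pass."""
    scored = []
    for doc_id, text in documents.items():
        hits = sum(
            len(re.findall(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE))
            for term in terms
        )
        if hits:
            scored.append((doc_id, hits))
    # Most keyword-dense documents first: the "wheat" surfaces early.
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

docs = {
    "memo-1": "The merger agreement was signed Friday.",
    "memo-2": "Lunch options for the retreat.",
    "memo-3": "Draft merger terms; the agreement needs review of the merger clause.",
}
ranked = keyword_filter(docs, ["merger", "agreement"])
```

Documents with no hits drop out entirely, which is the "sorting the probative wheat from the chaff" Allison describes.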

The point is that while we often hear complaints about how difficult e-discovery is, the legal profession (among others) is being transformed. Issues we face in our own workspaces are often due to societal trends over which we have no control. There are pain points everywhere, but sometimes just stepping back and looking at the big picture can be rewarding.

Posted in Articles, Data Mining, Technology, Trends | Tagged: | Leave a Comment »