You may have heard that the organization known as europe-v-facebook discovered a little-known provision in Facebook's privacy and user rights policies: if more than 7,000 users submitted comments on a proposed policy change, a vote on that change could be forced. When europe-v-facebook publicized this, users swarmed to show their support and have their voices heard in the discussion of Facebook's proposed policy updates.

After a flood of roughly 40,000 comments was posted regarding the proposed changes to Facebook's Statement of Rights and Responsibilities and Data Use Policy, a voting site was rolled out to let more users weigh in on the changes. In response, the folks at our-policy.org trotted out a wish list of changes they'd like to see. Some of the ideas included:

  1. Opt-in, rather than opt-out, consent for each feature through which Facebook wants to share users' information. This would let users control what they are willing to share, rather than forcing them to continually monitor for changes that could allow more data sprawl than they are comfortable with.
  2. A complete list of all data categories and subcategories Facebook keeps on users, and the ability to obtain a full copy of all the data Facebook holds on a given user within 40 days.
  3. Truly and permanently delete data that a user deletes.
  4. Limit third-party use of user data, and/or explain that use and allow users to opt out.

And a host of others.

But in the end, nothing happened. Facebook did not adopt the proposed changes and instead went ahead with its own. Why? It turns out the voting provision requires 30% of Facebook's users to take part in the vote for the outcome to be binding, a threshold that tops 300 million people. The folks at europe-v-facebook.org lament that Facebook's voting "mechanism was very well hidden on the platform, making sure that most users don’t even know about it," and they conclude that Facebook seemed less than zealous about seeking crowd-based input.

Of the relatively small number of Facebook users who did vote, 87% supported keeping the existing policies in place. So can the 'crowd' really have a seat at the table of the social media juggernaut, practically speaking? That remains to be seen. If similar efforts can gain more publicity, they could presumably force an issue and earn a say in the process. But how do you organize 300 million people, in a short time, to adopt a cohesive voice?

That question is probably at the forefront of discussion for the various grassroots organizations that would like to help shape how user data is shared across the site and with third parties, and to weigh in on privacy issues. One obvious suggestion is to persuade Facebook to notify its users when there is a vote, just as it notifies them about other things the company considers important.

Now that Facebook is a publicly listed company, there may be opportunities to exert pressure from other directions, such as investors. Still, sharing data can be a lucrative activity, and one could argue that the company's profitability depends on it. But what about the 'crowd' of ordinary users who would like a more active role in shaping the trajectory of their own data? Will Facebook treat this failed effort as a gauge of user sentiment and adopt programs to seek more input? Lots of folks hope so - we'll wait and see.

In the meantime, it's a good idea to take the matter into your own hands and - if it suits your own wishes - ratchet down the amount and type of information shared on your Facebook profile, which you can do by following the Facebook privacy primer here and the more general security primer here.