Keio University, Faculty of Environment and Information Studies, 2015 Entrance Exam: Section II
1
On November 2, 2010, Facebook's American
users were subject to an ambitious experiment in civic-engineering: Could a
social network get people to vote in that day's elections?
2
The answer was yes.
3
The way to [31](1. nudge 2. shake 3.
stroke) bystanders to the voting booths was simple. It consisted of a graphic
containing a link for looking up voting places, a button to click to announce
that you had voted, and the profile photos of up to six Facebook friends who
had indicated they'd already done the same. [32](1. Against 2. With 3. Beyond)
Facebook's cooperation, the political scientists who conducted the study
planted that graphic in the newsfeeds of tens of millions of users. Other
groups of Facebook users were shown a [33](1. generic 2. generous 3. genetic)
get-out-the-vote message or received no voting reminder at all. Then the
researchers compared their subjects' names with the day's actual voting records
to measure how much their voting prompt increased participation.
4
Overall, users who were notified of their
friends' voting were 0.39 percent more likely to vote than those in the other
group, and any resulting decisions to vote also appeared to spread to the
behavior of close Facebook friends, even if those people hadn't received the
original message. That small increase in voting rates [34](1. amounted to 2.
contrasted with 3. passed up) a lot of new votes. The researchers concluded
that their Facebook graphic directly mobilized 60,000 voters, and, thanks to
the ripple effect, ultimately caused an additional 340,000 votes to be cast
that day.
5
Now consider a hypothetical, [35](1. coolly
2. hotly 3. warmly) contested future election. Suppose that the CEO of Facebook
personally favors whichever candidate you don't like. He arranges for a
voting prompt to appear within the newsfeeds of tens of millions of active
Facebook users―but unlike in the 2010 experiment, the group that will not
receive the message is not chosen at random. Rather, he makes use of the fact
that Facebook "likes" can predict political views and political party
affiliation, even [36](1. before 2. beneath 3. beyond) the many users who
include that information in their profiles already. With that knowledge, he
could choose not to change the feeds of users who don't agree with his views.
This could then [37](1. flap 2. flip 3. flop) the outcome of the election.
Should the law constrain this kind of behavior?
6
The scenario imagined above is an example
of digital gerrymandering. All sorts of factors [38](1. contend with 2.
contrast with 3. contribute to) what Facebook or Twitter present in a feed, or
what Google or Bing show us in search results. Our expectation is that those
companies will provide open access to others' content and that the
variables in their processes just help [39](1. field 2. wield 3. yield) the
information we find most relevant. Digital gerrymandering occurs when a site
instead distributes information in a manner that serves its own political
agenda. This is possible on any service that personalizes what users see or the
order in which they see it, and it's increasingly easy to do.
7
There are plenty of reasons to regard
digital gerrymandering as so dangerous that no right-thinking company would
attempt it. But none of these businesses actually promise [40](1. accuracy 2.
neutrality 3. partiality). And they have already shown themselves willing to
leverage their awesome platforms to attempt to influence policy. In January
2012, for example, Google blacked out its home page "doodle" (the
logo graphic at the top of the page) as a protest [41](1. against 2. by 3.
for) the pending Stop Online Piracy Act (SOPA) in the US, which they thought
would cause censorship. The altered logo linked to an official blog [42](1. entrance
2. entree 3. entry) asking Google users to contact Congress to complain; SOPA
was ultimately abandoned, just as Google and many others had wanted. A
social-media or search company looking to take the [43](1. first 2. last 3.
next) step and attempt to create a favorable outcome in an election would certainly
have the means.
8
So what's stopping that from happening? The
most important fail-safe is the threat that a significant number of users,
outraged by a betrayal of trust, would start using different services, hurting
the company's income and reputation. [44](1. However 2. Meanwhile 3. Moreover),
although a Google doodle lies in plain view, newsfeeds and search results have
no standard form. They can be subtly [45](1. teased 2. tickled 3. tweaked)
without anyone knowing. Indeed, in our get-out-the-vote hypothetical situation
above, the people with the most reason to complain would be those who weren't
given the prompt and may never know it existed. Not only that, but the policies
of social networks and search engines already state that the companies can
change their newsfeeds and search results however they like. An effort to
change voter participation could be covered by the existing user agreements and
require no special notice to users.
9
[46](1. At the same time 2. By the way 3.
More to the point), passing new laws to prevent digital gerrymandering would be
a bad idea. People may be due the benefits of a democratic electoral process,
but in the United States, both people and corporations also have a First
Amendment right to free speech―and to present their content as they [47](1.
know 2. see 3. wish) fit. Meddling with how a company gives information to its
users, especially when the information is not false, is asking for trouble.
10
There's a better solution available:
requiring web companies entrusted with personal data and preferences to act as
"information fiduciaries*." Just as a doctor or lawyer is not allowed
to use information about his or her [48](1. patents 2. patience 3. patients) or
clients for outside purposes, web companies should also be prohibited from
doing this.
11
As things stand, web companies are simply
bound to follow their own privacy policies. Information fiduciaries would have
to do more. For example, they might be required to keep information about when
the personal data of their users is shared with another company, or is used in
a new way. They would provide a way for users to switch to unadulterated search
results or newsfeeds to see how that content would appear if it were not
personalized. And, most important, information fiduciaries would promise not to
use any formulas of personalization based on their own political goals.
12
Four decades ago, another emerging
technology had Americans worried about how it might be manipulating them. In
1974, there was a panic over the possibility of subliminal messages in TV
advertisements. As a result, the Federal Communications Commission prohibited
that kind of communication. There was a [49](1. floor 2. foundation 3. foot)
for that rule; historically, broadcasters have accepted a responsibility to be
fair in exchange for licenses to use the public airwaves. The same duty of
audience protection ought to be brought to today's dominant medium. As more and
more of what shapes our views and behaviors comes from invisible,
artificial-intelligence-driven processes, the worst-case [50](1. scenarios 2.
scenes 3. situations) should be placed off limits in ways that don't become
restrictions on free speech. Our information intermediaries can keep their sauces
secret, inevitably advantaging some sources of content and disadvantaging others,
while still agreeing that some ingredients are poison―and must be off the table.
[51] A "ripple effect" as used in
the 4th paragraph is best described by the way in which
1. the differences between things can
gradually become blurred.
2. a message becomes distorted by being
passed through many people.
3. a small change in one area can result in
a big change elsewhere.
4. the effects of an action can continue
and spread long after the event.
[52] Which of the following would be an
example of "digital gerrymandering" as described in the 6th
paragraph?
1. A well-known businessperson sends an
email to all of his or her company's customers in a certain region endorsing a
particular local political candidate.
2. A social network hides posts about a
certain state representative from the newsfeeds of network users who live
outside of the politician's home state.
3. A search engine lists positive articles
about a law the search company supports higher on the page for users in areas
where the law is less popular.
4. A company posts an essay on its home
page urging people to vote against a new law that would force the company out
of business.
[53] The story about Google and SOPA in the
7th paragraph is used as an example of an Internet company doing which of the
following?
1. Protecting the free speech rights of its
users.
2. Removing content that contradicts the
company's philosophy.
3. Violating users' privacy for the purpose
of political change.
4. Using its influence to make a political
statement.
[54] Which of the following is an
implication of the last two sentences of the 8th paragraph?
1. Users have no legal grounds for
complaining if an Internet company secretly manipulates them for political
purposes.
2. Internet companies routinely cite their
user agreements as justification for altering their content for political
purposes.
3. Users should have a right to vote on the
policies of the Internet services they use, but they are prevented from doing
so by the terms of use.
4. Internet companies have secretly added
policies allowing them to manipulate voter participation into their sites'
terms of use.
[55] Which of the following best describes
what the author means when he writes that changing how a service provides
information is "asking for trouble" in the 9th paragraph?
1. Making new laws to prevent digital
gerrymandering would be difficult.
2. Limiting companies' right to free speech
could have negative effects.
3. Doing so would violate the users' First
Amendment right to free speech.
4. Any such law would also apply to users'
political content.
[56] Which of the following activities
would be allowed for an information fiduciary, as described in this article?
1. Using a user's personal information to
deliver custom advertising content directly from the company.
2. Requiring a user to completely log out
of the service in order to see a generic search result or newsfeed.
3. Selling user browsing data to a business
partner for the purpose of creating a list of potential customers.
4. Collecting user location data from a
mobile application to predict income level and voting behavior.
[57] The example about subliminal messages
in the 12th paragraph is included to show which of the following?
1. Public worries about new technology's
impact on corporate speech are usually baseless.
2. There is legal precedent for prohibiting
certain kinds of corporate speech for the public good.
3. Corporate interests will always use new
technology to mislead the public for their own purposes.
4. Historically corporations have
negotiated with the public on how they can apply new technologies to speech.
[58] What does the author mean when he
states in the 12th paragraph, "The same duty of audience protection ought
to be brought to today's dominant medium"?
1. Internet services should safeguard the
public from secret manipulation.
2. Advertisements on Internet services
should conform to television standards.
3. The Federal Communications Commission
should not regulate Internet services.
4. Internet companies must be prohibited
from hosting political content.
[59] Which of the following could replace
the word "sauces" in the last sentence of the 12th paragraph?
1. Politics.
2. Agreements.
3. Methods.
4. Values.
[60] Which of the following best summarizes
the author's position on the problem of digital gerrymandering?
1. Although there is a high potential for
abuse, users have no choice but to trust Internet companies with their
information.
2. The free market will encourage Internet
companies to remain trustworthy with regards to delivering information.
3. A new legal category of business should
be established for Internet companies to protect users from unethical
practices.
4. Any attempt to limit the activities of
Internet companies will be ultimately ineffective due to the speed of
technological advancement.