When Joe Biden and Kamala Harris were sworn into office, it marked the first time in American history that Californians held two of the three highest offices in the federal government. No, President Biden is not from the Golden State, but Vice President Harris and Speaker of the House Nancy Pelosi both hail from the Bay Area. And with Attorney General Xavier Becerra holding a key cabinet position, officials from California now have a sizable role in influencing the Biden agenda.
The incoming administration is rightly prioritizing economic relief and Covid-19 vaccine deployment. On other issues, it will have to navigate narrow Democratic majorities in Congress, where some progressive policies could be nonstarters. To avoid gridlock, these high-ranking Californians can identify policies with broad, bipartisan support, perhaps taking a page out of their home state’s playbook.
In recent years, California has become a national leader on privacy rights. Oakland, San Francisco, and Santa Clara County, among other municipalities, have spearheaded strong local laws to oversee governmental use of people’s private information and data.
Forty civil rights and immigration groups, including Media Alliance, wrote to the Biden administration about plans to replace physical walls with surveillance walls at the U.S.-Mexico border.
The letter expressed concerns about a sharp increase in biometric data collection, about immigrants taking more remote and deadly routes to avoid detection, and about the use of the border for “testing” highly invasive military-grade surveillance.
United to Save the Mission, an umbrella coalition of community-based organizations in SF’s Mission District, wrote a letter objecting to a private camera network proposed by the Castro District Business Improvement District, joining the Harvey Milk and Alice B. Toklas Democratic Clubs and the Castro LGBTQ District in opposing the Castro cameras.
Facial recognition software has become a common part of American life. It’s used by government employment agencies to verify an applicant’s identity, by landlords to monitor tenants, and by police in their investigations, which has resulted in some wrongful arrests. Indeed, studies show that facial recognition algorithms are often inaccurate when it comes to identifying women and people with dark skin tones. Privacy advocates concerned by how law enforcement has used surveillance technology cheered Amazon’s recent decision to extend a moratorium on police use of its facial recognition software, though Amazon gave no reason why it was doing so. We’ll talk to Bay Area experts about how facial recognition technology is being used, why it needs to be closely monitored, and what cities, states and the federal government are doing — or not doing — to regulate its use.
Matt Cagle, technology and civil rights attorney with the ACLU
Brian Hofer, chair and executive director, Secure Justice
Daniel E. Ho, Scott Professor of Law, Stanford University, and associate director of Stanford’s Institute for Human-Centered Artificial Intelligence
Tracy Rosenberg, executive director, Media Alliance
The Berkeley Police Accountability Board, or PAB, discussed a successful negotiation with a perpetrator on Telegraph Avenue and delayed action on a proposal to expand the use of public safety surveillance cameras in Berkeley at its regular meeting Wednesday.
During the first public comment session, Berkeley resident Kitt Saginor raised concerns about the Berkeley Police Department’s COVID-19 response in light of Berkeley’s loosening mask policies. Saginor urged officers to continue masking within the department and community after photos on social media allegedly showed maskless officers in Target during the omicron surge.
Oakland Privacy representative Tracy Rosenberg spoke against the policy on Automated License Plate Readers, cameras that capture vehicle license plates. Expanding the policy would allow the scanning technology to be used beyond parking enforcement, the initial purpose presented by City Council.
“This is essentially a breaking of a contract that was made between the City Council and the residents in Berkeley in terms of why this equipment was brought and how it was going to be used,” Rosenberg said during the meeting.
Rosenberg discouraged the board from adding uses to law enforcement equipment due to difficulties in data extraction.
Privacy and criminal justice activists across the country are focusing on gunshot detection software, and specifically on the lead vendor, ShotSpotter, after the company’s forensic reports threw two innocent men in jail and drew police into a fatal encounter with 13-year-old Adam Toledo in Chicago.
The outdoor microphones, which are predominantly deployed in lower-income Black and Brown communities, routinely create dangerous situations by sending police alerts about gunfire that never took place. Independent research documents that around 90% of ShotSpotter alerts end with no evidence that gunfire ever occurred.
ShotSpotter has acquired a predictive policing company (HunchLab), promoted its technology’s potential as a drone activation system, and recently announced a partnership with Airobotics, a drone company based in Israel.
We need real solutions to gun violence, not routinely malfunctioning tech that is wildly expensive and drains public dollars.
If you’d like to help our coalition end Shotspotter contracts in the Bay Area and take the message to the company’s doors, watch this space.
Online Harms Need A Structural Solution: Ham-Handed Censorship Won’t Fix It
There is no doubt about it: Internet 2.0 made some people a lot of money. The early-2000s quandary of how to monetize the Internet was answered by the rise of surveillance capitalism, and those in Silicon Valley positioned to grab the data have made (and in some cases lost) vast fortunes.
But as the early 2000s receded, it became abundantly clear that the economic miracle of the monetized Internet carried grave societal harms: not just the obvious one, the institutionalization of an oligopoly of Big Tech firms that had scaled beyond any semblance of real competition, but kitchen-sink harms that included the exploitation of children and youth, sexual abuse, black markets for harmful drugs and guns, and the spread of virulent disinformation.
Not surprisingly, the large-scale distribution and increasing visibility of harmful content led to a desire to make the “bad content” go away, some of it broadly recognized as harmful and some more ambiguously characterized as such, depending on ideology.