Category Archives: work2

Avoiding DRM support in the Web will not reduce DRM, nor noticeably increase user privacy

Deprived of the ability to use browser plugins, protected content distributors are not, in general, switching to unprotected media. Instead, they’re switching away from the Web entirely. Want to send DRM-protected video to an iPhone? “There’s an app for that.” Native applications on iOS, Android, Windows Phone, and Windows 8 can all implement DRM, with some platforms, such as Android and Windows 8, even offering various APIs and features to assist this.

http://arstechnica.com/business/2013/05/drm-in-html5-is-a-victory-for-the-open-web-not-a-defeat/

In addition, DRM software or hardware on a device, even one otherwise loaded with free software, merely adds one more hole to the very many software, firmware and hardware security holes already present; avoiding it therefore does not noticeably increase user privacy levels.
Furthermore, DRM can potentially be deployed in a user-verifiable and verified way, except in the US, where unauthorized verification is illegal.

“Google Has Most of My Email Because It Has All of Yours | copyrighteous”

http://mako.cc/copyrighteous/google-has-most-of-my-email-because-it-has-all-of-yours

Despite the fact that I spend hundreds of dollars a year and hours of work to host my own email server, Google has about half of my personal email! Last year, Google delivered 57% of the emails in my inbox that I replied to. They have delivered more than a third of all the email I’ve replied to every year since 2006 and more than half since 2010.
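The share quoted above can be roughly estimated from a local mail archive. The sketch below is not the author’s actual analysis script (he published his own); it simply scans each message’s Received headers for Google’s mail servers, and the mbox path in the usage example is a placeholder. A real analysis would also restrict to replied-to messages, group by year, and match more Google domains.

```python
# Rough estimate of what fraction of an mbox archive was delivered by
# Google, by scanning each message's Received headers.
import mailbox

def google_share(mbox_path: str) -> float:
    box = mailbox.mbox(mbox_path)
    total = from_google = 0
    for msg in box:
        # Join all Received headers; absent headers yield an empty list.
        received = " ".join(msg.get_all("Received") or [])
        total += 1
        if "google.com" in received or "googlemail.com" in received:
            from_google += 1
    return from_google / total if total else 0.0
```

Usage would look like `google_share("/home/me/mail/inbox.mbox")`, returning a fraction between 0 and 1.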

Morozov on privacy techs

http://mobile.nytimes.com/2012/10/14/books/review/this-machine-kills-secrets-by-andy-greenberg.html?pagewanted=all

Even with regards to the leakers, however, the situation is far more complex than Greenberg lets on. He draws elaborate comparisons between the cases of Bradley Manning and Daniel Ellsberg, arguing that digital technologies have expanded the scale and the speed of leaking and made it easier to cover the tracks. But have we entered a truly new era, in which technology provides a robust infrastructure for leaking — a common techno-optimistic view advanced in many books about WikiLeaks? Or is the whole Cablegate episode just a blip in the long institutional march toward even greater secrecy — perhaps an instance of governments and corporations not taking their network security seriously but hardly a guarantee that they won’t adapt in due time?

The idea of transparent society synthesized

http://open.salon.com/blog/david_brin/2013/12/04/the_ongoing_privacy_problem_other_voices

In an article, Privacy is Dead; Long Live Transparency, Kevin Drum writes, “I call this the ‘David Brin question,’ after the science fiction writer who argued in 1996 that the issue isn’t whether surveillance will become ubiquitous — given technological advances, it will — but how we choose to live with it. Sure, he argued, we may pass laws to protect our privacy, but they’ll do little except ensure that surveillance is hidden ever more deeply and is available only to governments and powerful corporations. Instead, Brin suggests, we should all tolerate less privacy, but insist on less of it for everyone. With the exception of a small sphere within our homes, we should accept that our neighbors will know pretty much everything about us and vice versa. And we should demand that all surveillance data be public, with none restricted to governments or data brokers. Give everyone access to the NSA’s records. Give everyone access to all the video cameras that dot our cities. Give everyone access to corporate databases.”

Vulnerability exploitation by rent or hire

The Russian underground market is consolidating the sales model known as malware-as-a-service: a growing number of illicit products and hacking activities are available for rent. Like every market, the Russian underground has its own specialties, such as the sale of traffic direction systems (TDSs) and pay-per-install (PPI) services.

This model allows many more entities to have access to a given vulnerability without the risk of the vulnerability becoming known and therefore potentially fixed in the near term.

Former Mozilla CTO and inventor of JavaScript on the nonexistent security of major Web browsers

https://brendaneich.com/2014/01/trust-but-verify/

"Every major browser today is distributed by an organization within reach of surveillance laws. As the Lavabit case suggests, the government may request that browser vendors secretly inject surveillance code into the browsers they distribute to users. We have no information that any browser vendor has ever received such a directive. However, if that were to happen, the public would likely not find out due to gag orders."

What happened with TrueCrypt? And why?

Most plausible explanation of what happened with TrueCrypt: https://news.ycombinator.com/item?id=7814725

The whole message on the site makes no sense and I think that’s on purpose. What likely happened is the US gov found the TC authors, then used their weight to try and get them to back door the binaries. Authors didn’t want to, but couldn’t publicize the letters without going to jail, so they made up the most ridiculous story for why they were giving up on the project, the best possible outcome so that they wouldn’t go to jail and wouldn’t subject users to the required back door.

Great comments in Schneier blog:

Another line of thought goes like this: if the NSA really, really wants to know WHAT Snowden had access to, and wanted, say, to use a tempest solution to grab that information, one way to do that would be to spook someone known to have received that info, using psy-ops to persuade them to decrypt the entire data set from whatever air-gapped machine it is on onto some other machine. Which is simply a way to suggest that Bruce, Greenwald, et al. ought to review their personal security and NOT be spooked into spinning up the NSA archives and trying to migrate that data. Put those laptops under lock and key and don’t use them for a few days. Don’t run off and mass-migrate those archives just yet.

There is a ticket to remove TrueCrypt from Tails dated May 19th at the latest.
https://tails.boum.org/blueprint/replace_truecrypt/
Considering that Jacob Appelbaum 1) has worked on the Snowden files, 2) is involved in Tails, 3) and in Tor, and 4) Tails seems to have had advance warning, I am putting my hands down that this is connected.

bae24d3fff – May 29, 2014 9:06 AM
I just want to mention that this has wiped out the TrueCrypt forum too.
There were hundreds of users at the TC forum (myself included), which contained a goldmine of information, not just about TrueCrypt itself but also crypto and computer security in general.
Many people put in many hours of work in the forum, and it would seem that that repository of knowledge is gone at a stroke.

Firefox to deliver both DRM & user privacy? It can be done, but in a different way

http://www.theguardian.com/technology/2014/may/14/firefox-closed-source-drm-video-browser-cory-doctorow

The inclusion of Adobe’s DRM in Firefox means that Mozilla will be putting millions of its users in a position where they are running code whose bugs are illegal to report. So it’s very important that this code be as isolated as possible.

By open-sourcing the sandbox that limits the Adobe software’s access to the system, Mozilla is making it auditable and verifiable. This is a much better deal than users will get out of any of the rival browsers, like Safari, Chrome and Internet Explorer, and it is a meaningful and substantial difference.

Seems to me that Mozilla and Adobe may even be able to pull off a tech solution that concurrently guarantees user privacy rights and content owners’ entitlements.

Even if they did – and it may very well turn out to be an impossible task – it wouldn’t matter significantly to users’ privacy, because most of the software and firmware stacks below Firefox remain 10 to 100 times larger than what is affordably verifiable, and most firmware and physical hardware on commercial devices is not verifiable at all.

Our project, User Verified Social Telematics, aims to do exactly that, with world-class partners.

A case for a “Trustless Computing Group”

Is it possible to imagine a Trustless Computing Group that deploys the same kind of hardware-level security standards deployed to date by the (in)famous Trusted Computing Group – but (a) intrinsically user-accountable, (b) severely hardened, and (c) extended to manufacturing-process oversight – to concurrently guarantee users’ privacy AND content owners’ copyrights via user-verifiable security assurance processes?

The term “trustless computing” is chosen because it concurrently means (a) the opposite of Trusted Computing™ – which users couldn’t trust, as they could not verify or analyse it, and content providers couldn’t trust, as it got broken all the time – and (b) “computing that does not require trust in any person, entity or technology”, which carries to its ultimate conclusion the Trust No One model proposed by US security expert Steve Gibson.

The Trusted Computing Group has over the last decade deployed 2,121,475,818 devices (today’s count on their website) which contain hardware, firmware and software technologies that cannot, in their entirety, be legally (in the US) and/or practically verified openly by third parties, and which are therefore almost surely full of vulnerabilities resulting from malicious actions – by the NSA and many other parties – from incompetence, and/or from lack of open public oversight and testing. As history has shown.

In addition to its insufficient trustworthiness, two main contradictions of Trusted Computing remain completely unsolved since its inception over a decade ago:

  1. DRM (and other trusted computing) keeps on getting broken. Nonetheless, content owners are fine, since its technical weakness was solved by Apple and similar strategies that turned entire platforms into DRM systems (what Schneier calls the feudal security model) and/or made it impractical enough for the average user to widely consume pirated content on commercial entertainment computing devices.
  2. Its negative impact on users’ privacy remains intact and unresolved. Nonetheless, it has become more and more evident to everyone over this decade – and even more so since Snowden – that the hardware and software technologies we use are so vulnerable or broken – and the business model of most B2C cloud services so catastrophic for users – that DRM is rightfully perceived as just one more of the many vulnerabilities already there, and therefore not worth fighting against.

This week, the Trusted Computing Group claimed that their model is the right one “to solve today’s most urgent cybersecurity problems”, such as those that have emerged since the Snowden revelations, for example those caused by vulnerabilities in widely used critical free software like OpenSSL.

Of course, this must be a joke. The most urgent cybersecurity needs concern the actual security of end-to-end systems, to protect against security and privacy breaches that can gravely damage citizens or state agencies – not failed technology standards that have been the prime movers of hardware-level security-through-obscurity paradigms. Those paradigms have produced what we are now discovering to be a completely broken computing industry, in which commercial computing is far more complex than can ever be assured for security, and vulnerabilities abound at every hardware and software level of every device, with the high probability that a significant number of actors in any nation, not just the NSA, have access to many of them.

Now, what?

What if instead we flipped it over, and created a standards body named the Trustless Computing Group, based on free software and a hardware-level security-through-transparency paradigm, that would use the same user-verifiable processes to guarantee (1) unprecedented privacy and freedom to users, and (2) unprecedented security to content owners!? Why can’t the same socio-technical assurance processes guarantee both users’ data and content owners’ data?!

That’s what we are aiming at with the User Verified Social Telematics project and the related draft campaigns for an international standard and for governmental legislation promoting it.

Alternative names for it:

Trustless Computing?!

Trustless Telematics!?

Verified Telematics!?

User Verified Telematics?!

Transparent Telematics!?

Got any suggestions? …

Nov 24th UPDATE: (1) Some typos and unclear passages have been revised. (2) We have started setting up such a consortium, although it is temporarily called the User Verified Social Telematics Consortium.

Don’t be fooled, a way out of hardware backdoors exists!

This latest 60-second video excerpt (32.40–34.00) by Bruce Schneier, and this Oct 2013 MIT Technology Review article, show how extremely complex, widespread and probable the problem is of firmware or physical backdoors inserted into extremely widely used hardware components during the device manufacturing process.

That is only expected to get worse as, post-Snowden, illegal or unconstitutional spying by both state and non-state entities will increasingly have to rely on expanding the capabilities for automated, targeted exploitation of millions of end-user devices, as most internet traffic and data become encrypted, and as the most widely used software for encryption and onion routing gets improved and hardened for security.

Schneier and the MIT article’s author implicitly or explicitly state that there is nothing that can be done to assure users of their safety against such a huge current threat.

I believe Schneier is wrong to say that nothing can be done. Or, better, I think he really meant that nothing can be done if we want the feature and application richness we are used to from today’s mainstream commercial computing, since those platforms are either in cahoots with one or more national governments and/or their complexity is far beyond anyone’s ability to verify them adequately.

The solution is simply to simplify!

The solution is hinted at in a statement by the DARPA representative in the mentioned MIT article, when he said:

DoD relies on millions of devices to bring network access and functionality to its users. Rigorously vetting software and firmware in each and every one of them is beyond our present capabilities, and the perception that this problem is simply unapproachable is widespread.

I’d argue that what he really means when he talks about the large number of DoD devices is not really their number in units, but the number of different DoD devices and the complexity of many or most of them.

That makes sense: given very large but still limited budgets, complete verifiability and adequate verification of every hardware component on a given device can be achieved for a few targeted, extremely simple hardware platforms, albeit at huge upfront cost (and relatively very low marginal cost).

The same process would instead be hugely costly, or effectively impossible, for more complex devices that rely on a large number of complex components from many different third parties, where adequate oversight of manufacturing processes may be hugely costly or impossible – even through enhanced versions of programs such as the DoD Trusted Foundry Program (TFP) – because of obstacles due to IP protection, corporate choices, or the interests of the national security agencies of the nation hosting the fab.

The solution is therefore to focus the limited resources (high single-digit or low double-digit millions of dollars) of an international joint venture of private, non-profit and ethical-hacker communities (supported by private funds, partnering IT companies, and state and foundation grants) on a single minimal hardware platform (or SoC). Such a platform: is suitable – albeit with very severe performance and functional limitations – for server, router and handheld end-user devices for basic communications; has extreme simplicity of features, hardware and software; and has complete verifiability and enacts adequately extreme and open verification.

The resulting levels of assurance, and the consequent value to ordinary and ultra-critical users alike, would produce large revenues with which to gradually expand capabilities and features, without reducing – and possibly while increasing – the assurance level. User Verified Social Telematics.

How Tails could bring privacy to all with 8M euros

Tails, the free-software USB GNU/Linux OS, is reportedly used by Snowden and Schneier as their main secure desktop platform.

It’s definitely a major step ahead with respect to everything else. But, aside from its poor usability and its availability only for PCs, does it provide nearly enough privacy and security after what has come out in the last year?!

I see major potential critical vulnerabilities (open to scalable remote exploitation) coming from:
- a way too large OS and apps, even if severely stripped down and hardened
- not nearly enough expert verification per quantity of code
- no public background checks on contributors and lead developers and architects (who are anonymous)
- users’ firmware
- users’ hardware
- Tor network vulnerabilities due to traffic analysis, bugs in poorly verified FLOSS code (such as OpenSSL), and the low number of expected non-malicious and competently managed nodes.

I imagine Snowden and Schneier protect against these through special setups and configurations, rules of behavior, and so on. But such tricks require very high skills – shared by one’s communication interlocutors – and they drive usability even lower.

We at the Open Media Cluster believe we have identified a solution to these vulnerabilities and usability problems of Tails (and similar systems) that could cost under €8M in R&D to build and test, and could be made affordable and usable by any Western citizen as a parallel environment for secure computing.

It involves modifying Tails by:
- stripping it down to very basic features
- embedding it in a barebones 3mm touch-screen device with HDMI out (to display on your desktop monitor) and Bluetooth (to go on the Net via your phone), which can be attached to the back of any phone via a hard case
- adding very thorough (relative to quantity of code) and open verification of all software and firmware
- adding manufacturing-process oversight exceeding the US DoD “Trusted Foundry Program” in user-verifiability
- improving Tor security and performance through traffic-spoofing techniques, direct incentives for non-malicious and properly configured nodes, and very extensive Tor code review
- a few more tricks

See more at the User Verified Social Telematics project.

Facebook CEO against constitutional rights

http://readwrite.com/2014/04/11/facebook-privacy-controls-hand-them-over#awesm=~oBXuMo1qypNZF8

In fact, Mark Zuckerberg famously said, “Having two identities for yourself is an example of a lack of integrity.”

Our Constitutions prescribe a right to privacy of communications and of the vote (and therefore of political opinions) in order to protect the freedom of speech, association and participation of the large part of the population that cannot always say what they think.

By eliminating privacy we eliminate (what’s left of) democracy.

Consequences of letting mobile Apps marginalize the Web

The decline of the mobile web

The likely end state is the web becomes a niche product used for things like 1) trying a service before you download the app, 2) consuming long tail content (e.g. link to a niche blog from Twitter or Facebook feed).

This will hurt long-term innovation for a number of reasons:

1) Apps have a rich-get-richer dynamic that favors the status quo over new innovations. Popular apps get home screen placement, get used more, get ranked higher in app stores, make more money, can pay more for distribution, etc. The end state will probably be like cable TV – a few dominant channels/apps that sit on users’ home screens and everything else relegated to lower tiers or irrelevance.

2) Apps are heavily controlled by the dominant app store owners, Apple and Google. Google and Apple control what apps are allowed to exist, how apps are built, what apps get promoted, and charge a 30% tax on revenues.

Is this battle only global, or is it possible to win, or partially win, this battle within a continent or a single nation whose major public and private actors gather to defend the Web?

OpenSSL, Heartbleed and the need for minimal but truly trustable telematics

Here’s what the company developing the world’s number-two password manager says you should do after the OpenSSL Heartbleed vulnerability:

http://blog.agilebits.com/2014/04/12/1password-heartbleed-and-you/

The best advice I can give you is to change your most important website passwords immediately, including your email, bank accounts, and other high value targets. This will provide your best defense against previous attacks.

After a few weeks, websites will have been upgraded with new SSL certificates, and you will be able to trust SSL again. At this point you should change all of your passwords again.
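Whether a given site has actually been issued a fresh certificate since the Heartbleed disclosure (April 7, 2014) can be checked from the certificate’s notBefore date. A minimal sketch, assuming the date string is in the format Python’s ssl module reports from `ssl.SSLSocket.getpeercert()`:

```python
# Decide whether a certificate was issued after the Heartbleed
# disclosure date, given its notBefore field as returned by Python's
# ssl module (e.g. cert["notBefore"] == "Apr  9 00:00:00 2014 GMT").
from datetime import datetime

HEARTBLEED_DISCLOSURE = datetime(2014, 4, 7)

def reissued_after_heartbleed(not_before: str) -> bool:
    issued = datetime.strptime(not_before, "%b %d %H:%M:%S %Y %Z")
    return issued > HEARTBLEED_DISCLOSURE

print(reissued_after_heartbleed("Apr  9 00:00:00 2014 GMT"))  # prints True
```

In practice the notBefore string would come from `getpeercert()` on a live TLS connection to the site in question; a reissued certificate is of course only meaningful if the site also revoked the old one and rotated its private key.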

The insecurity of our current IT infrastructure, devices and services is so ridiculously widespread that the only solution is to develop a parallel, minimal but truly trustable, verifiable and extensively verified telematics infrastructure (devices, software, server-side equipment and processes).

Here it is: User Verified Social Telematics.

And that should also become an international standard, and be made law for very sensitive e-government services – such as in the Lazio Region, to be extended to the Italian national level… here’s our campaign.

US Government Funded Your Favorite ‘NSA-Proof’ Apps

http://revolution-news.com/us-government-funds-favorite-nsa-proof-apps/

If the Open Technology Fund had never published the projects that they sponsor, their true funding sources may have never been known. The most commonly used open source license still does not require any financial disclosure at all. Which ultimately leads to a question: who else is the US government funding?

Total and user-verifiable financing transparency should be one of the necessary requirements of any future state-of-the-art digital privacy IT solution.