Category Archives: general

Exodus Intelligence, exposing TAILS bugs, may be doing the best service to privacy after Snowden

Exodus Intelligence, by revealing how vulnerable top FLOSS projects are, may be doing the greatest service to privacy since Snowden.

They are finally making clear – to mainstream tech writers, privacy tool users and developers alike – that software should be audited much more thoroughly relative to its complexity, which means large investments and/or hugely expanded volunteer participation.

Sure, a zero-day market should not exist, but it always will, and it will keep growing, as it cannot be stopped. No major country will make it illegal not to disclose a discovered zero-day, because every other major country would continue to stockpile them.

We are fortunate that some in that market see an economic convenience in releasing such info (and, apparently, in releasing it responsibly).

The only major objection to Exodus Intelligence is that they haven’t gone nearly far enough, as there are so many potential vulnerabilities at the firmware and hardware level which they do not mention.

I’d argue they know this very well, given their general competencies. But possibly they haven’t gone there because they cannot provide any services in that area, and it is in their best interest to underestimate such threats in order to increase the perceived value of their software-level zero-days for defensive purposes.

Unfortunately, we may never see a similar company emerge for hardware-level zero-days, as it would have to come from the upper echelons of US state security agencies, or from the highest-clearance executives of dominant mainstream processor and hardware makers and of the major world foundries.

To start moving towards solving those vulnerabilities we’ll have to rely on their proven feasibility, on the opinions of the world’s foremost expert persons and bodies, and on other supporting evidence. We’ll look at that in a future post.

Snowden on privacy tech solutions and code verifiability

Snowden, in an interview with the Guardian two days ago, talks about (1) proper privacy tech solutions and (2) the importance of verifiability and free software licensing.

Our User Verified Social Telematics project seems quite aligned with what he said.

(1) About proper privacy tech solutions he said:

“Recently, I’ve been spending a lot of time thinking about press freedom issues in addition to the ordinary individual’s private communications, and I’ve been partnering with civil liberties organisations to see where we can contribute and try to create new tools, new techniques, new technologies that will make sure our rights are protected regardless of the status of law in a given jurisdiction.

Imagine an app or a cell phone or an operating system for a cell phone or a small device, anything that would allow people to have free and ready access to meaningfully secure communications platforms that don’t require sophistication to use and operate”.

By mentioning apps, he’s clearly trying to encourage privacy innovation at all stack levels, and overall investment. Proper encryption apps would make passive, super-low-cost surveillance, in transit or in the cloud, difficult or impossible.

Nonetheless, if “meaningful” protection from low-cost, semi-automated targeted surveillance (at the endpoints, beyond the point of encryption) could be provided by an app, he wouldn’t be talking about “operating systems”. That mention clearly points to the TAILS live-booting OS on the desktop (which his chosen journalists use for their communications with him), and to GSMK Cryptophone phones running free software apps and a GNU/Linux OS.

Furthermore, his mention of a “small device” instead of a “mobile device” or “portable device” clearly acknowledges the difficulty of protecting against unverified baseband processors, and the other issues and complexities involved in securing a phone. It very likely refers to efforts such as Tomy (an alpha project of the TAILS team), meant to run on WiFi-only mobile devices, or on mobile devices where the mobile network functionality can be reliably removed. It may also refer to solutions such as R&S Top Sec or Secusmart phones with microSD solutions (used by Angela Merkel), if they were verifiable (and certifiably, adequately verified) in their software and hardware, and transparent in their design.

The current approach of the Tomy project may not be optimal because:

  • It would still be vulnerable to hardware and firmware vulnerabilities, such as those of the main processor and co-processors, including the USB device used and its firmware. And each device will have its own (as in TAILS).
  • It is not at all clear to what extent it may be possible to reliably disable the baseband processor while the device is in use.
  • It has the inconvenience of having to reboot every time, and it works only when WiFi is available.
  • There is no strategic plan to date to attract nearly the resources needed to develop such a solution to high-enough levels of assurance and to promote its wide adoption.

(2) On free software and verifiability he said:

I think everybody has some exposure to proprietary software in their lives, even if they’re not aware of it. Your cell phones for example are running tons and tons of proprietary code from all the different chip manufacturers and all of the different cell phone providers.

We are moving very slowly but meaningfully in the direction of free and open software that’s reviewable, or, even if you can’t do it, a community of technologists [who] can look at what these devices are really doing on the software level and say, is this secure, is this appropriate, is there anything malicious or strange in here? That increases the level of security for everybody in our communities.

I’d argue he is referring to the fact that many free software users, activists and experts often underestimate the importance of proprietary firmware, which renders meaningless ALL the control and freedom from snooping and tampering they believe they gain by running only free software at the OS and app layers. He also makes clear that free software is preferred, but that verifiability of source code may initially be sufficient for security assessment.

Schneier and the need for ballot-box-type procedures like the CivicRoom

In this video Bruce Schneier (minutes 33:21 to 36:00) makes direct reference to the need to deploy in-person “secret sharing” schemes inspired by ballot box voting procedures, such as the ones we devised for the UVST CivicRoom and demonstrated with a physical installation in 2007 at a major ICT event at the Ara Pacis in Rome, in partnership with Progetto Winston Smith.

That event was organized by Alessandra Poggiani – then director of the Lazio Region IT agency LAIT, and now the newly-appointed head of Agenda Digitale Italiana. She also participated as a main speaker in our IPTV 2.0 event the following year.


One way to resist endpoint attacks is to mix the most valuable information into huge files that cannot be easily and undetectably exfiltrated by attackers through low-level device vulnerabilities

One of the world’s greatest cryptologists says:

“I want the secret of the Coca-Cola company not to be kept in a tiny file of 1KB, which can be exfiltrated easily by an APT,” Shamir said. “I want that file to be 1TB, which can not be exfiltrated. I want many other ideas to be exploited to prevent an APT from operating efficiently. It’s a totally different way of thinking about the problem.”
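
To make Shamir’s idea concrete, here is a minimal, purely illustrative sketch of my own (not Shamir’s design, and using a 1 GB blob instead of 1 TB so it stays runnable): the small secret is XOR-masked with a keystream derived from a hash of a huge random blob, so that recovering the secret requires exfiltrating the entire blob, which is far harder to do quietly.

    # Illustrative sketch only: bind a small secret to a huge random blob so that
    # recovering the secret requires copying out the ENTIRE blob.
    import os, hashlib

    def inflate_secret(secret: bytes, blob_size: int = 1024**3):
        # blob_size defaults to 1 GB here for runnability; Shamir's example is ~1 TB
        blob = os.urandom(blob_size)               # huge file the attacker must exfiltrate in full
        key = hashlib.sha256(blob).digest()        # key depends on every byte of the blob
        stream = hashlib.shake_256(key).digest(len(secret))
        masked = bytes(a ^ b for a, b in zip(secret, stream))
        return blob, masked                        # store both; 'masked' alone is useless

    def recover_secret(blob: bytes, masked: bytes) -> bytes:
        key = hashlib.sha256(blob).digest()
        stream = hashlib.shake_256(key).digest(len(masked))
        return bytes(a ^ b for a, b in zip(masked, stream))

Of course such a trick does not stop an attacker who already controls the endpoint while the secret is in use; it only makes quiet, low-bandwidth exfiltration of the data at rest far more conspicuous, which seems to be exactly the point Shamir is making.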

New York Times to focus on exchanging its services for privacy, like Google and Facebook

http://mobile.nytimes.com/2014/06/20/business/media/new-york-times-and-washington-post-to-develop-platform-for-readers-contributions.html?_r=0&referrer=

“Everyone’s been talking for years about using the web in a better way without cheapening content, but simply adding a post by ‘anonymous’ is not a way to maintain the journalistic quality of any publication,” said Alberto Ibargüen, Knight’s chief executive. “There was a need to find a way to engage the audience in a way that enhances discussion.”

Translation: they’ll be selling your data for targeted ads, like everyone else

Firefox to deliver both DRM & user privacy? It can be done, but in a different way

http://www.theguardian.com/technology/2014/may/14/firefox-closed-source-drm-video-browser-cory-doctorow

The inclusion of Adobe’s DRM in Firefox means that Mozilla will be putting millions of its users in a position where they are running code whose bugs are illegal to report. So it’s very important that this code be as isolated as possible.

By open-sourcing the sandbox that limits the Adobe software’s access to the system, Mozilla is making it auditable and verifiable. This is a much better deal than users will get out of any of the rival browsers, like Safari, Chrome and Internet Explorer, and it is a meaningful and substantial difference.

It seems to me that Mozilla and Adobe may even be able to pull off a tech solution that concurrently guarantees user privacy rights and content owners’ entitlements.

Even if they did – and it may very well turn out to be an impossible task – it wouldn’t matter significantly to users’ privacy, because most software and firmware stacks below Firefox keep on being 10 to 100 times larger than what is affordably verifiable, and most firmware and physical hardware on commercial devices are not even verifiable.

Our User Verified Social Telematics project aims to do exactly that – make the whole stack verifiable – with world-class partners.

A case for a “Trustless Computing Group”

Is it possible to imagine a Trustless Computing Group that deploys the same kind of hardware-level security standards deployed to date by the (in)famous Trusted Computing Group – but (a) intrinsically user-accountable, (b) severely hardened, and (c) extended to manufacturing process oversight – in order to concurrently guarantee users’ privacy AND content rights owners’ copyrights via user-verifiable security assurance processes?

The term “trustless computing” is chosen because it concurrently means (a) the opposite of Trusted Computing™ – which users can’t trust, as they could not verify or analyse it, and which content providers couldn’t trust, as it got broken all the time – and (b) “computing that does not require trust in any person, entity or technology”, carrying to its ultimate conclusion the Trust No One model proposed by US security expert Gibson.

The Trusted Computing Group has, over the last decade, deployed 2,121,475,818 devices (today’s count on their website) which contain hardware, firmware and software technologies that cannot, in their entirety, be legally (in the US) and/or practically verified openly by third parties, and which are therefore most surely full of vulnerabilities resulting from malicious actions – by the NSA and many other parties – from incompetence, and/or from lack of open public oversight and testing. As history has shown.

In addition to its insufficient trustworthiness, two main contradictions of Trusted Computing remain completely unsolved since its inception over a decade ago:

  1. DRM (and other trusted computing) keeps on getting broken. Nonetheless, content owners are fine, since its technical weakness was solved by Apple and similar strategies that made their entire platforms DRM systems (what Schneier calls the feudal security model), and/or by making it impractical enough for the average user to widely consume pirated content on commercial entertainment computing devices.
  2. Its negative impact on users’ privacy remains intact and unresolved. Nonetheless, it has become more and more evident to everyone over this decade – and even more so since Snowden – that the hardware and software technologies we use are so vulnerable or broken – and the business model of most B2C cloud services so catastrophic for users – that DRM is rightfully perceived as just one more of the many vulnerabilities that are there already, and therefore not worth fighting against.

This week, the Trusted Computing Group claimed that their model is the right model “to solve today’s most urgent cybersecurity problems”, such as those that have emerged since the Snowden revelations, for example those caused by vulnerabilities in widely used critical free software like OpenSSL.

Of course, this must be a joke. The most urgent cybersecurity need is actual security of end-to-end systems, to protect against security and privacy breaches that can cause grave damage to citizens or state agencies – not failed technology standards that have been the prime movers of the hardware-level security-through-obscurity paradigm. That paradigm has produced what we are now discovering to be a completely broken computing industry, in which commercial computing is far more complex than can ever be assured for security, vulnerabilities abound at all hardware and software levels of our devices, and there is a high probability that a significant number of actors in many nations, not just the NSA, have access to many of them.

Now, what?

What if instead we flipped it over, and created a standards body named Trustless Computing Group, based on free software and a hardware-level security-through-transparency paradigm, that would use the same user-verifiable processes to guarantee (1) unprecedented privacy and freedom to users, and (2) unprecedented security to content owners!? Why can’t the same socio-technical assurance processes guarantee both users’ data and content owners’ data?!

That’s what we are aiming at with the User Verified Social Telematics project and the related draft campaigns for an international standard and for governmental legislation promoting it.

Alternative names for it:

Trustless Computing?!

Trustless Telematics!?

Verified Telematics!?

User Verified Telematics?!

Transparent Telematics!?

Got any suggestions? …

Nov 24th UPDATE: Some typos and unclear passages have been revised. We have started setting up such a consortium, although it is temporarily called the User Verified Social Telematics Consortium.

Don’t be fooled, a way out of hardware backdoors exists!

This latest 60-second video excerpt (32:40–34:00) by Bruce Schneier, and this Oct 2013 MIT Review article, show how extremely complex, widespread and probable the problem is of firmware or physical backdoors inserted into extremely widely-used hardware components during the device manufacturing process.

That is only expected to get worse since, post-Snowden, illegal or unconstitutional spying by both state and non-state entities will increasingly have to rely on expanding the capabilities of automated, targeted exploitation of millions of end-user devices, as most internet traffic and data will be encrypted, and as the most widely used software for encryption and onion routing gets improved and hardened for security.

Schneier and the MIT article’s author implicitly or explicitly state that there is nothing that can be done to assure users with regard to their safety against such a huge current threat.

I believe Schneier is wrong in saying that there is nothing to be done or, better, I think he really meant that there is nothing to be done if we want the kind of feature and application richness we are used to in today’s mainstream commercial computing, as those platforms are either in cahoots with one or more national governments and/or their complexity is way beyond the ability of anyone to verify them adequately.

The solution is simply to simplify!

The solution is hinted at in a statement by the DARPA representative in the MIT article mentioned above, when he said:

DoD relies on millions of devices to bring network access and functionality to its users. Rigorously vetting software and firmware in each and every one of them is beyond our present capabilities, and the perception that this problem is simply unapproachable is widespread.

I’d argue that what he really means, when he talks about the large number of DoD devices, is not so much their number in units as the number of different DoD device types, and the complexity of many or most of them.

That makes sense since, given very large but still limited budgets, complete verifiability and adequate verification of every hardware component on a given device can be achieved for a few targeted, extremely simple hardware platforms, albeit with huge upfront costs (and relatively very low marginal costs).

The same process would instead be hugely costly, or effectively impossible, for more complex devices that rely on a large number of complex components from many different third parties, where adequate oversight of manufacturing processes may be hugely costly or impossible – even through enhanced versions of programs such as the DoD Trusted Foundries Program (TFP) – because of obstacles related to IP protection, corporate choices, or the interests of the national security agencies of the nation hosting the fab.

The solution is therefore to focus the limited resources (in the high single-digit or low double-digit millions of dollars) of an international joint venture of private, non-profit and ethical-hacker communities (supported by private funds, partnering IT companies, and state and foundation grants) on a single minimal hardware platform (or SoC). Such a platform: would be suitable – albeit with very severe performance and functional limitations – as a server, router and handheld end-user device for basic communications; would have extreme simplicity of features, hardware and software; and would offer complete verifiability and enact adequately extreme and open verification.

The resulting levels of assurance, and the consequent value to ordinary users and to ultra-critical users, would produce large revenues with which to gradually expand capabilities and features, without reducing – and while possibly increasing – the assurance level. That is User Verified Social Telematics.

India eyes becoming a world leader in privacy-enhancing technologies

CIS India, the leading IT rights think tank in India, is proposing that India bet on world-class IT privacy as a key competitive advantage for its IT industry. They are seeing how the fullest protection of digital human rights can, in a post-Snowden world, become the primary competitive advantage of the IT industry of an entire nation.

We pointed to that opportunity during our event in Rome with Richard Stallman, named “Full realisation of citizen digital rights as huge economic opportunity for the Lazio Region”.

Italy could do that as well – even before India, Brazil or Switzerland – as we are proposing with our Open Media District project, which includes a dedicated technology park, dedicated supporting legislation and large trail-blazing R&D projects, such as User Verified Social Telematics.

Here’s what CIS envisions:

Post-Snowden, the so called swing states occupy the higher moral ground. It is time for these states to capitalize on this moment using strong political will. Instead of just being a friendly jurisdiction from the perspective of access to medicine, it is time for India to also be the enabling jurisdiction for access to knowledge more broadly. We could use patent pools and compulsory licensing to provide affordable and innovative digital hardware [especially mobile phones] to the developing world. This would ensure that rights-holders, innovators, manufactures, consumers and government would all benefit from India going beyond being the pharmacy of the world to becoming the electronics store of the world. We could explore flat-fee licensing models like a broadband copyright cess or levy to ensure that users get content [text, images, video, audio, games and software] at affordable rates and rights-holders get some royalty from all Internet users in India. This will go a long way in undermining the copyright enforcement based censorship regime that has been established by the US. When it comes to privacy – we could enact a world-class privacy law and establish an independent, autonomous and proactive privacy commissioner who will keep both private and state actors on a short lease. Then we need a scientific, targeted surveillance regime that is in compliance with human rights principles. This will make India simultaneously an IP and privacy haven and thereby attract huge investment from the private sector, and also earn the goodwill of global civil society and independent media. Given that privacy is a precondition for security, this will also make India very secure from a cyber security perspective. Of course this is a fanciful pipe dream given our current circumstances but is definitely a possible future for us as a nation to pursue.


How Tails could bring privacy to all with 8M euros

Tails, the free software USB GNU/Linux OS, is reportedly used by Snowden and Schneier as their main secure desktop platform.

It’s definitely a major step ahead compared to everything else. But, aside from its poor usability and its availability only for PCs, does it provide nearly enough privacy and security, given what has come out in the last year?!

I see major potential critical vulnerabilities (to scalable remote exploitation) coming from:
-a way too large OS and apps, even if severely stripped down and hardened
-not nearly enough expert verification per quantity of code
-no public background checks on contributors and lead developers and architects (who are anonymous)
-users’ firmware
-users’ hardware
-Tor network vulnerabilities due to: traffic analysis, bugs in poorly verified FLOSS code (such as OpenSSL), and the low number of expected non-malicious and competently-managed nodes.

I imagine Snowden and Schneier protect against these through setups and configurations, rules of behavior, and so on. But such tricks require very high skills – which must be shared by your communication interlocutors – and they drive usability even lower.

We at the Open Media Cluster believe we have identified a solution to such vulnerabilities and usability problems of Tails (and similar systems), which could cost under 8M€ in R&D to build and test, and could be made affordable and usable by any Western citizen as a parallel environment for secure computing.

It involves modifying Tails by:
-stripping it down to very basic features
-embedding it in a barebone 3mm touch-screen device with HDMI out (to display on your desktop monitor) and Bluetooth (to go on the Net via your phone), which can be attached to the back of any phone via a hard case
-adding very thorough (relative to the quantity of code) and open verification of all software and firmware
-adding manufacturing process oversight that exceeds, in user-verifiability, the US DoD “Trusted Foundry Program”
-improving Tor security and performance through traffic spoofing techniques, direct incentives for non-malicious and properly configured nodes, and very extensive Tor code review
-a few more tricks

See more at the User Verified Social Telematics project.

Facebook CEO thinks privacy rights should not exist

http://m.thedrum.com/news/2014/04/19/larry-page-dreams-place-no-privacy-laws-axel-springer-ceo-claims-open-letter-google

Describing how the founder responded to a question about Facebook storing data, Dopfner wrote: “Zuckerberg said: ‘I do not understand your question. Those who have nothing to hide, have nothing to fear.’

“Again and again I had to think about this sentence. It’s terrible. I know it is certainly not meant that way. This is a mindset that was fostered in totalitarian regimes not in liberal societies. Such a sentence could also be said by the head of the Stasi or other intelligence service or a dictatorship.”

Consequences of letting mobile Apps marginalize the Web

The decline of the mobile web

The likely end state is the web becomes a niche product used for things like 1) trying a service before you download the app, 2) consuming long tail content (e.g. link to a niche blog from Twitter or Facebook feed).

This will hurt long-term innovation for a number of reasons:

1) Apps have a rich-get-richer dynamic that favors the status quo over new innovations. Popular apps get home screen placement, get used more, get ranked higher in app stores, make more money, can pay more for distribution, etc. The end state will probably be like cable TV – a few dominant channels/apps that sit on users’ home screens and everything else relegated to lower tiers or irrelevance.

2) Apps are heavily controlled by the dominant app stores owners, Apple and Google. Google and Apple control what apps are allowed to exist, how apps are built, what apps get promoted, and charge a 30% tax on revenues.

Is this battle only global, or is it possible to win, or partially win, this battle within a continent or a single nation where major public and private actors gather to defend the Web?

OpenSSL, Heartbleed and the need for minimal but truly trustable telematics

Here’s what the company developing the number-two password manager in the world says about what you should do after the OpenSSL Heartbleed vulnerability:

http://blog.agilebits.com/2014/04/12/1password-heartbleed-and-you/

The best advice I can give you is to change your most important website passwords immediately, including your email, bank accounts, and other high value targets. This will provide your best defense against previous attacks.

After a few weeks, websites will have been upgraded with new SSL certificates, and you will be able to trust SSL again. At this point you should change all of your passwords again.
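
As a rough, hedged illustration of the quoted advice (the host name, cut-off date and function name below are my own example assumptions, not from the article): one simple heuristic, before trusting a site again and rotating your password there, is to check whether its TLS certificate was (re)issued after Heartbleed was disclosed on 7 April 2014.

    # Heuristic sketch only: a post-Heartbleed notBefore date does not prove the
    # private key was actually rotated, it is merely a hint that the cert is new.
    import socket, ssl
    from datetime import datetime

    HEARTBLEED_DISCLOSED = datetime(2014, 4, 7)

    def cert_reissued_since_heartbleed(host: str, port: int = 443) -> bool:
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()          # validated peer certificate
        issued = datetime.strptime(cert["notBefore"], "%b %d %H:%M:%S %Y %Z")
        return issued > HEARTBLEED_DISCLOSED

    # Example (hypothetical host):
    # print(cert_reissued_since_heartbleed("example.com"))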

The insecurity of our current IT infrastructure, devices and services is so ridiculously widespread that the only solution is to develop a parallel, minimal but truly trustable, verifiable and extensively verified telematics infrastructure (devices, software, server-side equipment and processes).

Here it is: User Verified Social Telematics.

And that should also become an international standard, and be made a legal requirement for very sensitive e-government services, such as in the Lazio Region, to be extended to the Italian national level… here’s our campaign.

US Government Funded Your Favorite ‘NSA-Proof’ Apps

http://revolution-news.com/us-government-funds-favorite-nsa-proof-apps/

If the Open Technology Fund had never published the projects that they sponsor, their true funding sources may have never been known. The most commonly used open source license still does not require any financial disclosure at all. Which ultimately leads to a question: who else is the US government funding?

Total and user-verifiable financing transparency should be one of the necessary requirements of any future state-of-the-art digital privacy IT solution.