Category Archives: work1

If US and EU intelligence-oversight parliamentarians can ALL be spied upon, how can we trust even the best civilian privacy solutions out there?!

If the computing devices of the US Senate Intelligence Committee can be undetectably spied upon for an unknown amount of time, and Snowden – as he swore – could read the emails of any member of the EU Parliament's espionage inquiry committee, then is anyone safe in the civilian world?

How many people and actors may have such access through illegal means?

How likely is it for abuse to be discovered by external reviews?

How can we ever estimate that? Can such actors also undetectably tamper with devices to plant false evidence?

Blackphone's “idea” of transparency, and the media's buy-in

Blackphone's CTO prides himself on their transparency while stating that they will never “release” all their code for review, nor tell their customers when a critical bug may have been discovered. They do not even mention firmware or hardware schematics, nor do they clarify which third-party code they use that will not be available for review:

I welcome any and all discussion but the immutable constraint is this: we will do testing, we will publish a Transparency Report reflecting an honest view of the results, and we will use this data as evidence of due diligence in support of our objectives of security and privacy.

It doesn’t mean we can share absolutely everything, and it doesn’t mean we’ll release information the instant we receive it. For business or other reasons we may choose to hang onto certain things until after we’ve implemented fixes, but our Chief Security Officer’s team will be responsible for managing this line of communication and keeping the world informed of whatever we can share.

Blackphone and the IT security media

Months after its launch, with no code released (not to mention firmware, hardware schematics or fab oversight), the only people questioning how in the world we can even assess its security are a few blog commenters, while everyone from Schneier down either cheers for the secure phone or stays silent.
We clearly have a problem of competence, and one of political correctness, with long-time IT security experts not wanting to criticize their pal Zimmermann head-on.

Here are a few comments on Slashdot that point out the obvious:

Still Secret Source? (+4, Insightful)
bill_mcgonigle 2 days ago
Blackphone is the “you can’t look at it, but trust us” self-proclaimed “security” company, right? And it’s easily exploitable?
Dog-bites-man story.

Re: Still Secret Source? (+4, Insightful)
chihowa 2 days ago
It’s one reason why I can’t rally behind Phil Zimmerman, as much as I like PGP and appreciate much of what he’s done. His insistence on keeping security software secretive and closed source, while seeming to understand the concept of trust, is baffling.

Re: Still Secret Source?
Anonymous Coward 2 days ago
Indeed. If you are going to write software that can secure something it should be solid enough that be able to view the code doesn’t allow someone to just punch holes right through said security. Security through obscurity is something even Microsoft has learned doesn’t work so why is this champion of secure computing trying to push it

A case for UVST in my “The economics of meaningful assurance of computing services for civilian use” lecture slides

On Aug 8th 2014 in Trento, Italy, Open Media Cluster Director Dr. Rufo Guerreschi was invited and honored by Jovan Golic – the EU EIT ICT Labs Privacy, Security and Trust Action Line Leader of the €3 billion EU R&D agency – to hold the (only) Concluding Guest Lecture to over 50 post-graduate students selected for the prestigious EU EIT ICT Labs “Security and Privacy in Digital Life” Summer School.

During the 90-minute presentation, titled “The economics of meaningful assurance of computing services for civilian use”, he argued the limited costs, public benefits and technical feasibility of creating computing services (and devices) with meaningfully-high security and privacy assurance for wide-scale civilian deployment, such as those we have been pursuing with our User Verified Social Telematics project, with over 15 Italian, EU and Brazilian partners.

Here is a copy of the slides (odt, pdf), or here on Slideshare:

Exodus Intelligence, exposing TAILS bugs, may be doing the best service to privacy after Snowden

Exodus Intelligence, by revealing how vulnerable even top FLOSS projects are, may be doing the greatest service to privacy since Snowden.

They are finally making clear – to mainstream tech writers, and to privacy tool users and developers alike – that software should be audited much more thoroughly relative to its complexity, which means large investments and/or hugely expanded volunteer participation.

Sure, a zero-day market should not exist, but it always will, and it will keep growing, because it cannot be stopped. No major country will make it illegal to withhold disclosure of a discovered zero-day, because every other major country would continue to stockpile them.

We are fortunate that some in that market see an economic advantage in releasing such information (and apparently in doing so responsibly).

The only major objection to Exodus Intelligence is that they have not gone nearly far enough, as there are so many potential vulnerabilities at the firmware and hardware level which they do not mention.

I'd argue they know this very well, given their general competencies. But possibly they have not gone there because they cannot provide any services in that area, and it is in their best interest to underestimate such threats in order to increase the perceived value of their software-level zero-days for defensive purposes.

Unfortunately, we may never see a similar company emerge for hardware-level zero-days, as it would have to draw on the upper echelons of US state security agencies, or on the highest-clearance executives of dominant mainstream processor and hardware makers and major world foundries.

To start addressing those vulnerabilities, we will have to rely on their proven feasibility, on the opinions of the world's foremost experts and expert bodies, and on other supporting evidence. We will look at that in a future post.

Schneier and the need for ballot-box-style procedures like the CivicRoom

In this video, Bruce Schneier (from minute 33:21 to 36:00) makes direct reference to the need to deploy in-person “secret sharing” schemes inspired by ballot-box voting procedures, such as the ones we have devised for the UVST CivicRoom, which we demonstrated with a physical installation in 2007 at a major ICT event at the Ara Pacis in Rome, in partnership with Progetto Winston Smith.
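For readers unfamiliar with the term, the sketch below illustrates the cryptographic idea behind such (k, n) “secret sharing” schemes, assuming Shamir's classic construction over a prime field. It is only an illustration of the underlying primitive, not the in-person, ballot-box-style CivicRoom procedure itself.

```python
# Minimal illustration of (k, n) Shamir secret sharing over a prime field.
# NOT the CivicRoom procedure itself - just the cryptographic idea it draws on.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a short secret

def split_secret(secret, k, n):
    """Split `secret` into n shares; any k of them can reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Recombine k shares via Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split_secret(123456789, k=3, n=5)   # e.g. 5 trustees, any 3 suffice
assert reconstruct(shares[:3]) == 123456789
```

In a physical, ballot-box-style procedure the same threshold idea is enforced organizationally rather than in code: no single trustee can reconstruct the secret alone.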

That event was organized by Alessandra Poggiani, the newly elected head of the Agenda Digitale Italiana, in her then role as director of the Lazio Region IT agency LAIT. She also participated as a main speaker at our IPTV 2.0 event the following year.

Firefox to deliver both DRM & user privacy? It can be done, but in a different way

http://www.theguardian.com/technology/2014/may/14/firefox-closed-source-drm-video-browser-cory-doctorow

The inclusion of Adobe’s DRM in Firefox means that Mozilla will be putting millions of its users in a position where they are running code whose bugs are illegal to report. So it’s very important that this code be as isolated as possible.

By open-sourcing the sandbox that limits the Adobe software’s access to the system, Mozilla is making it auditable and verifiable. This is a much better deal than users will get out of any of the rival browsers, like Safari, Chrome and Internet Explorer, and it is a meaningful and substantial difference.

It seems to me that Mozilla and Adobe may even be able to pull off a technical solution that concurrently guarantees users' privacy rights and content owners' entitlements.

Even if they did – and it may very well turn out to be an impossible task – it would not matter significantly for users' privacy, because most software and firmware stacks below Firefox remain 10 to 100 times larger than what is affordably verifiable, and most firmware and physical hardware on commercial devices are not verifiable at all.

Our project User Verified Social Telematics aims to do exactly that, with world class partners.

A case for a “Trustless Computing Group”

Is it possible to imagine a Trustless Computing Group that deploys the same kind of hardware-level security standards deployed to date by the (in)famous Trusted Computing Group – but (a) intrinsically user-accountable, (b) severely hardened, and (c) extended to manufacturing process oversight – in order to concurrently guarantee users' privacy AND content owners' copyrights via user-verifiable security assurance processes?

The term “trustless computing” is chosen because it concurrently means (a) the opposite of Trusted Computing™ – which users cannot trust, as they could not verify or analyse it, and which content providers cannot trust, as it gets broken all the time – and (b) a “computing that does not require trust in any person, entity or technology”, carrying to its ultimate conclusion the Trust No One model proposed by US security expert Gibson.

Over the last decade, the Trusted Computing Group has deployed 2,121,475,818 devices (today's count on their website) which contain hardware, firmware and software technologies that cannot, in their entirety, be legally (in the US) and/or practically verified openly by third parties, and which are therefore almost surely full of vulnerabilities resulting from malicious actions – by the NSA and many other parties – from incompetence, and/or from lack of open public oversight and testing. As history has shown.

In addition to its insufficient trustworthiness, two main contradictions of Trusted Computing have remained completely unsolved since its inception over a decade ago:

  1. DRM (and other trusted computing schemes) keep getting broken. Nonetheless, content owners are fine, since this technical weakness was offset by Apple and similar strategies that turned entire platforms into DRM systems (what Schneier calls the feudal security model) and/or made it impractical enough for the average user to widely consume pirated content on commercial entertainment computing devices.
  2. Its negative impact on users' privacy remains intact and unresolved. Nonetheless, it has become more and more evident to everyone over this decade – and even more so since Snowden – that the hardware and software technologies we use are so vulnerable or broken – and the business model of most B2C cloud services so catastrophic for users – that DRM is rightfully perceived as just one more of the many vulnerabilities that are already there, and therefore not worth fighting against.

This week, the Trusted Computing Group claimed that their model is the right one “to solve today's most urgent cybersecurity problems”, such as those that have emerged since the Snowden revelations, for example those caused by vulnerabilities in widely used critical free software like OpenSSL.

Of course, this must be a joke. The most urgent cybersecurity need is actual security of end-to-end systems, to protect against security and privacy breaches that can cause grave damage to citizens or state agencies, not failed technology standards that have been the prime movers of hardware-level security-through-obscurity paradigms. Those paradigms have produced what we are now discovering to be a completely broken computing industry, where commercial computing is far more complex than can ever be assured for security, vulnerabilities abound at every hardware and software level of every device, and a significant number of actors in any nation, not just the NSA, most likely has access to many of them.

Now, what?

What if instead we flipped it over, and created a standards body named the Trustless Computing Group, based on free software and a hardware-level security-through-transparency paradigm, that would use the same user-verifiable processes to guarantee (1) unprecedented privacy and freedom to the user, and (2) unprecedented security to the content owner!? Why can't the same socio-technical assurance processes guarantee both users' data and content owners' data?!

That is what we are aiming at with the User Verified Social Telematics project and the related draft campaigns for an international standard and for governmental legislation promoting it.

Alternative names for it:

Trustless Computing?!

Trustless Telematics!?

Verified Telematics!?

User Verified Telematics?!

Transparent Telematics!?

Got any suggestions? …

Nov 24th UPDATE: Some typos and unclear passages have been revised. We have started setting up such a consortium, although it is temporarily called the User Verified Social Telematics Consortium.

Don’t be fooled, a way out of hardware backdoors exists!

This latest 60-second video excerpt (32:40-34:00) by Bruce Schneier, and this Oct 2013 MIT Review article, show how extremely complex, widespread and probable the problem is of firmware or physical backdoors being inserted into very widely used hardware components during the device manufacturing process.

That is only expected to get worse as, post-Snowden, illegal or unconstitutional spying by both state and non-state entities will increasingly have to rely on expanding the capabilities of automated, targeted exploitation of millions of end-user devices, since most internet traffic and data will be encrypted and the most widely used software for encryption and onion routing keeps being improved and hardened for security.

Schneier and the MIT article's author implicitly or explicitly state that there is nothing that can be done to assure users of their safety against such a huge current threat.

I believe Schneier is wrong in saying that there is nothing to be done. Or rather, I think he really meant that there is nothing to be done if we want the kind of feature and application richness we are used to from today's mainstream commercial computing vendors, since they are either in cahoots with one or more national governments and/or their products' complexity is far beyond anyone's ability to verify them adequately.

The solution is simply to simplify!

The solution is hinted at in a statement by a DARPA representative quoted in the MIT article mentioned above:

DoD relies on millions of devices to bring network access and functionality to its users. Rigorously vetting software and firmware in each and every one of them is beyond our present capabilities, and the perception that this problem is simply unapproachable is widespread.

I'd argue that what he really means when he talks about the large number of DoD devices is not really their number in units, but the number of different DoD device models and the complexity of many or most of them.

That makes sense since, given very large but still limited budgets, complete verifiability and adequate verification of every hardware component of a given device can be achieved for a few targeted, and extremely simple, hardware platforms, albeit with a huge upfront cost (and relatively very low marginal costs).
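As a back-of-envelope illustration of that cost structure, a sketch follows; all figures in it are hypothetical placeholders, not estimates from any actual project. The point it makes is simply that the assurance cost per device collapses once the one-off verification effort is amortized over volume.

```python
# Hypothetical amortization of a one-off open verification effort.
# All numbers are illustrative assumptions, not actual project figures.
def per_unit_cost(upfront_audit_eur, marginal_check_eur, units_shipped):
    """One-off audit cost spread over every unit, plus per-unit oversight cost."""
    return upfront_audit_eur / units_shipped + marginal_check_eur

for units in (10_000, 100_000, 1_000_000):
    cost = per_unit_cost(upfront_audit_eur=20_000_000,  # assumed one-off audit effort
                         marginal_check_eur=5,          # assumed per-unit oversight cost
                         units_shipped=units)
    print(f"{units:>9} units -> {cost:,.2f} EUR of assurance cost per device")
```

With these assumed figures, the per-device assurance cost falls from about 2,005 EUR at 10,000 units to about 25 EUR at one million units, which is the economic intuition behind targeting one extremely simple, widely reused platform.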

The same process would instead be hugely costly, or effectively impossible, for more complex devices that rely on a large number of complex components from many different third parties, where adequate access to manufacturing process oversight may be hugely costly or impossible – even through enhanced versions of programs such as the DoD Trusted Foundry Program (TFP) – because of obstacles arising from IP protection, corporate choices, or the interests of the national security agencies of the nation hosting the fab.

The solution is therefore to focus the limited resources (in the high single-digit or low double-digit millions of dollars) of an international joint venture of private, non-profit and ethical hacker communities (supported by private funds, partnering IT companies, and state and foundation grants) on a single minimal hardware platform (or SoC). Such a platform: is suitable – albeit with very severe performance and functional limitations – as a server, router and handheld end-user device for basic communications; has extreme simplicity of features, hardware and software; and has complete verifiability and enacts adequately extreme and open verification.

The resulting levels of assurance, and the consequent value to ordinary users and to ultra-critical users, would produce large revenues with which to gradually expand capabilities and features without reducing, and possibly while increasing, the assurance level. That is User Verified Social Telematics.

India eyes becoming the world leader in privacy-enhancing technologies

CIS India, the leading IT rights think tank in India, is proposing that India bet on world-class IT privacy as a key competitive advantage for its IT industry. They see how, in a post-Snowden world, the fullest protection of digital human rights can become the primary competitive advantage of the IT industry of an entire nation.

We pointed to that opportunity during our event in Rome with Richard Stallman, titled “Full realisation of citizen digital rights as huge economic opportunity for the Lazio Region“.

Italy could do that as well – even before India, Brazil or Switzerland – as we are proposing with our Open Media District project, which includes a dedicated technology park, dedicated supporting legislation and large trail-blazing R&D projects, such as User Verified Social Telematics.

Here’s what CIS envisions:

Post-Snowden, the so called swing states occupy the higher moral ground. It is time for these states to capitalize on this moment using strong political will. Instead of just being a friendly jurisdiction from the perspective of access to medicine, it is time for India to also be the enabling jurisdiction for access to knowledge more broadly. We could use patent pools and compulsory licensing to provide affordable and innovative digital hardware [especially mobile phones] to the developing world. This would ensure that rights-holders, innovators, manufactures, consumers and government would all benefit from India going beyond being the pharmacy of the world to becoming the electronics store of the world. We could explore flat-fee licensing models like a broadband copyright cess or levy to ensure that users get content [text, images, video, audio, games and software] at affordable rates and rights-holders get some royalty from all Internet users in India. This will go a long way in undermining the copyright enforcement based censorship regime that has been established by the US. When it comes to privacy – we could enact a world-class privacy law and establish an independent, autonomous and proactive privacy commissioner who will keep both private and state actors on a short lease. Then we need a scientific, targeted surveillance regime that is in compliance with human rights principles. This will make India simultaneously an IP and privacy haven and thereby attract huge investment from the private sector, and also earn the goodwill of global civil society and independent media. Given that privacy is a precondition for security, this will also make India very secure from a cyber security perspective. Of course this is a fanciful pipe dream given our current circumstances but is definitely a possible future for us as a nation to pursue.


How Tails could bring privacy to all with 8M euros

Tails, the free software USB GNU/Linux OS, is reportedly used by Snowden and Schneier as their main secure desktop platform.

It is definitely a major step ahead with respect to everything else. But, aside from its poor usability and its availability only for PCs, does it provide nearly enough privacy and security, after what has come out in the last year?!

I see major potential critical vulnerabilities (open to scalable remote exploitation) coming from:
- a way too large OS and set of apps, even if severely stripped down and hardened
- not nearly enough expert verification per quantity of code
- no public background checks on contributors, lead developers and architects (who are anonymous)
- users' firmware
- users' hardware
- Tor network vulnerabilities due to traffic analysis, bugs in poorly verified FLOSS code (such as OpenSSL), and the low number of expected non-malicious and competently managed nodes.

I imagine Snowden and Schneier protect themselves from these through special setups and configurations, rules of behavior, and so on. But such tricks require very high skills, which your communication interlocutors must share, and they drive usability even lower.

We at the Open Media Cluster believe we have identified a solution to these vulnerabilities and usability problems of Tails (and similar systems), one that could cost under €8M of R&D to build and test, and that could be made affordable and usable by any Western citizen as a parallel environment for secure computing.

It involves modifying Tails by:
- stripping it down to very basic features
- embedding it in a barebones 3mm touch-screen device with HDMI out (to display on your desktop monitor) and Bluetooth (to go on the Net via your phone), which can be attached to the back of any phone via a hard case
- adding very thorough (relative to the quantity of code) and open verification of all software and firmware
- adding manufacturing process oversight exceeding in user-verifiability the US DoD “Trusted Foundry Program”
- improving Tor security and performance through traffic spoofing techniques, direct incentives for non-malicious and properly configured nodes, and very extensive Tor code review
- a few more tricks

See more at the User Verified Social Telematics project.

Consequences of letting mobile Apps marginalize the Web

The decline of the mobile web

The likely end state is the web becomes a niche product used for things like 1) trying a service before you download the app, 2) consuming long tail content (e.g. link to a niche blog from Twitter or Facebook feed).

This will hurt long-term innovation for a number of reasons:

1) Apps have a rich-get-richer dynamic that favors the status quo over new innovations. Popular apps get home screen placement, get used more, get ranked higher in app stores, make more money, can pay more for distribution, etc. The end state will probably be like cable TV – a few dominant channels/apps that sit on users’ home screens and everything else relegated to lower tiers or irrelevance.

2) Apps are heavily controlled by the dominant app stores owners, Apple and Google. Google and Apple control what apps are allowed to exist, how apps are built, what apps get promoted, and charge a 30% tax on revenues.

Is this battle only global, or is it possible to win, or partially win, this battle within a continent or a single nation where major public and private actors gather to defend the Web?

Google's newly announced Tango project validates the UVST project, but hardly competes with it

Yesterday, just before the Mobile World Congress 2014 in Barcelona, Google announced with wide media coverage (Gigaom, ArsTechnica, VentureBeat) its latest mobile device innovation, Google Tango: a new smartphone with 3D sensors on its back face that provide Kinect-like functionality on the move and in the living room, for fun, games and beyond.

All of Tango's capabilities, features and user experience have, for 3 years already, been fully part of the CivicPod, the core end-user device of our User Verified Social Telematics (UVST) R&D project, except that the CivicPod provides substantial additional features and advantages, at a lower cost and while being to a wide extent Tango-compatible, albeit with lower performance. Like UVST, Tango is also an open innovation project, developed with over 16 private and public research centers worldwide.

In UVST, 3D sensors such as those of Tango are embedded in the CivicPod, a 3mm-thin, Bluetooth-connected touch-screen device with two front-facing cameras with refractive lenses, which can be attached to the user's smartphone through a custom rigid case, or to the TV frame through a dedicated docking station.

Therefore, in addition to Tango's capabilities, the CivicPod user can:

  • Just buy an ultra-thin, user-friendly, multi-function peripheral embedding such Tango-compatible, Kinect-like sensors, instead of buying a new dedicated smartphone. This brings the user: huge cost savings, the ability to easily port the sensors to their next smartphone, the ability to use their smartphone while the sensors are active for on-TV living-room applications, and just 1.5mm of additional thickness.
  • Access most Tango applications, since for Tango SDK developers wanting to port their apps to the CivicPod it is just a matter of adding Bluetooth APIs to the application and accounting for the very minimal delay added by the Bluetooth connection.
  • Access by default a Tango-compatible CivicPod application that enables its use as a highly innovative, ergonomic and immersive «magic» touch-based controller of on-TV content, available through a dedicated cheap CivicDongle, ChromeCast and other compatible TV-connected devices. Through the front-facing cameras with refractive lenses, the positions of the user's fingertips above the CivicPod screen are tracked, relayed wirelessly to such a TV-connected device, and made visible on the TV screen as halos of varying size. Finger position information appears as a semi-transparent video-overlay stream on the TV screen that decreases in opacity and size as the fingers get closer to the CivicPod screen. Touch events are also relayed to the CivicDongle to trigger touch events in the CivicDongle UI, and therefore on the TV screen. Overall, the user gets the experience of «touch controlling» their TV from the comfort of their sofa (or bed), while looking at the TV screen at all times instead of the CivicPod screen, including while typing on a virtual keyboard, without their fingers hiding the key about to be pressed (a rough sketch of this halo mapping is given after this list).
  • Access ultra-private mobile and desktop communications and social features with other CivicPods, through UVST's leading-edge end-to-end privacy-enhancing architecture and unprecedented verification processes, which even include “user-verifiable” hardware manufacturing oversight procedures that exceed those of the US Dept. of Defense “Trusted Foundry Program”.
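As a rough illustration of the overlay logic described in the third bullet above, the sketch below shows how an estimated fingertip height could be mapped to halo size and opacity, and how an actual touch could be relayed. The FingerSample structure, the function names and all parameter values are assumptions for illustration only, not the actual CivicPod/CivicDongle protocol.

```python
# Sketch of the hover-halo mapping described above: as a fingertip approaches
# the CivicPod screen, its on-TV halo shrinks and fades.
# The FingerSample structure and all parameter values are illustrative assumptions.
from dataclasses import dataclass

MAX_HOVER_MM = 60.0              # assumed height at which a finger starts being shown
MIN_HALO_PX, MAX_HALO_PX = 18, 90
MIN_OPACITY, MAX_OPACITY = 0.25, 0.9

@dataclass
class FingerSample:
    x_norm: float                # fingertip position, normalized to the pod screen (0..1)
    y_norm: float
    height_mm: float             # estimated height above the screen (0 = touching)

def halo_for(sample: FingerSample, tv_w: int, tv_h: int):
    """Map one tracked fingertip to a halo (x, y, radius, opacity) on the TV screen."""
    closeness = min(1.0, max(0.0, 1.0 - sample.height_mm / MAX_HOVER_MM))
    radius = MAX_HALO_PX - closeness * (MAX_HALO_PX - MIN_HALO_PX)
    opacity = MAX_OPACITY - closeness * (MAX_OPACITY - MIN_OPACITY)
    return (int(sample.x_norm * tv_w), int(sample.y_norm * tv_h), int(radius), round(opacity, 2))

def to_touch_event(sample: FingerSample, tv_w: int, tv_h: int):
    """When the finger actually touches the pod, relay a touch event for the dongle UI."""
    if sample.height_mm <= 0.0:
        return {"type": "touch", "x": int(sample.x_norm * tv_w), "y": int(sample.y_norm * tv_h)}
    return None

# Example: a finger hovering 15 mm above the centre of the pod screen
print(halo_for(FingerSample(0.5, 0.5, 15.0), tv_w=1920, tv_h=1080))
# -> (960, 540, 36, 0.41) with these assumed parameters
```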

For more information see the UVST project web page.

Privacy Oversight Board Agrees with EFF: Mass Surveillance Is Illegal and Must End

A few days ago, the special independent committee appointed by Obama reported on NSA activities:

Based on the information provided to the Board, including classified briefings and documentation, we have not identified a single instance involving a threat to the United States in which the program made a concrete difference in the outcome of a counterterrorism investigation. Moreover, we are aware of no instance in which the program directly contributed to the discovery of a previously unknown terrorist plot or the disruption of a terrorist attack.

Up to millions of end-user devices may be remotely snooped upon

Additionally, under an extensive effort code-named GENIE, U.S. computer specialists break into foreign networks so that they can be put under surreptitious U.S. control. Budget documents say the $652 million project has placed “covert implants,” sophisticated malware transmitted from far away, in computers, routers and firewalls on tens of thousands of machines every year, with plans to expand those numbers into the millions.

http://m.washingtonpost.com/world/national-security/us-spy-agencies-mounted-231-offensive-cyber-operations-in-2011-documents-show/2013/08/30/d090a6ae-119e-11e3-b4cb-fd7ce041d814_story.html