We are living in troubled times: times of terrorist threats and security crises, in which our private freedoms and our right to privacy are endangered more than ever.
Governments and their intelligence agencies around the world are using these real threats and dangers to diminish our right to privacy more and more, with the fight against those very threats as the pretext. Do not misunderstand me: it is their job to fight terrorism and security threats, and they should. But not at the expense of our own security and our universal right to privacy.
Who has access to our private information: your tech giant, your provider, your hosting service? What permissions have you granted them, and which have you not? How do they handle your private data, and how much effort do they put into protecting you and your privacy? This is the most important question, because privacy is a basic human right. You choose whom to trust, but it is their business to justify that trust.
The power of information is the power of knowledge, but it should not mean our private lives being scanned, collected, and put into storage because of someone's bad judgment, political or business interest, fear, or sheer will and power to do so, under the excuse that "it is better to prevent than to cure" or, worse, "let's keep it in case we need it." That is wrong.
There should be a legal and lawful procedure for these things: a court order, a proper investigation, and credible intelligence about someone's involvement in such security threats, backed by strong evidence of suspicion, before a government approves surveillance of someone's private life. And once those measures are approved and in use, law and justice should not stop there. Surveillance must be carried out only within legal procedures that may not be overstepped. For example: what kind of information, and how much of it, may be collected? Who has access to it? Who oversees the surveillance process and the agencies empowered to conduct it? What is done with the information collected, and how is it handled? What may be collected and stored, and what may not? For how long, and where, may it be stored? If someone is proven not guilty, what then: should that person be informed and their privacy restored? And, most importantly, is it all governed by law, rather than some grey area where everything is permitted? After the surveillance ends, the justice system should scrutinize all of those findings with the highest legal and moral tools at its disposal. There are many more issues that should be taken into consideration, but this article is not about them.
Some people in the security business will say that these are troubled times, that the people we are being protected from care for nothing, and that we too should dirty our hands to win this fight. No, that is wrong!
If we choose to fight darkness with darkness, then darkness is all that remains, and they win. We should not sacrifice our way of life, our beliefs, and our light; that is exactly what they are aiming at.
Until these issues are properly addressed, we should not take security for granted; we should take matters into our own hands and try to stay ahead of the times.
The purpose of this article is to help you feel more secure by reviewing the security and privacy of the services we all use, such as Face, Viber, BBM, etc. TBU NEWS is not saying these services are totally insecure; some of them are doing their best to tackle security issues and play this game of cat and mouse. But if you want extra security, then yes, they fall short. TBU NEWS will try to help you decide which kind of service to choose when you want extra security and privacy, whether occasionally or all the time. And if you do choose one, always keep yourself updated on these matters, because this is the oldest game of cat and mouse.
By TBU NEWS
Which apps and tools actually keep your messages and calls safe?
In the face of widespread Internet surveillance, we need a secure and practical means of talking to each other from our phones and computers. Many companies offer “secure messaging” products—but are these systems actually secure? We decided to find out, in the first phase of a new EFF Campaign for Secure & Usable Crypto.
This scorecard represents only the first phase of the campaign. In later phases, we are planning to offer closer examinations of the usability and security of the tools that score the highest here. As such, the results in the scorecard below should not be read as endorsements of individual tools or guarantees of their security; they are merely indications that the projects are on the right track. For practical advice and tutorials on how to protect your online communication against surveillance, check out EFF’s Surveillance Self-Defense guide.
THE GREEN IS GOOD
TBU NEWS has chosen two of them, but you have to decide on your own.
For years, privacy and security experts worldwide have called on the general public to adopt strong, open-source cryptography to protect our communications. The Snowden revelations have confirmed our worst fears: governments are spying on our digital lives, grabbing up communications transmitted in the clear.
Given widespread government surveillance, why don’t people routinely use tools to encrypt their communications? Wouldn’t we all communicate a little more freely without the shadow of surveillance?
It boils down to two things: security and usability. Most of the tools that are easy for the general public to use don’t rely on security best practices–including end-to-end encryption and open source code. Messaging tools that are really secure often aren’t easy to use; everyday users may have trouble installing the technology, verifying its authenticity, setting up an account, or may accidentally use it in ways that expose their communications.
EFF, in collaboration with Julia Angwin at ProPublica and Joseph Bonneau at the Princeton Center for Information Technology Policy, is launching a campaign for secure and usable crypto. We are championing technologies that are strongly secure and also simple to use.
The Secure Messaging Scorecard examines dozens of messaging technologies and rates each of them on a range of security best practices. Our campaign is focused on communication technologies — including chat clients, text messaging apps, email applications, and video calling technologies. These are the tools everyday users need to communicate with friends, family members, and colleagues, and we need secure solutions for them.
We chose technologies that have a large user base–and thus a great deal of sensitive user communications–in addition to smaller companies that are pioneering advanced security practices. We’re hoping our scorecard will serve as a race-to-the-top, spurring innovation around strong crypto for digital communications.
Here are the criteria we looked at in assessing the security of various communication tools.
1. Encrypted in transit?
This criterion requires that all user communications be encrypted along all the links in the communication path. Note that we are not requiring encryption of data that is transmitted on a company network, though that is ideal. We do not require that metadata (such as user names or addresses) be encrypted.
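For the transport-layer case, here is a minimal Python sketch (not part of the scorecard's methodology) of how a client enforces encryption in transit with the standard `ssl` module; the hostname is illustrative:

```python
import ssl

# create_default_context() gives a client context with certificate
# verification and hostname checking already enabled -- the settings
# that make "encrypted in transit" meaningful against interception.
context = ssl.create_default_context()

# Make the security-relevant defaults explicit:
assert context.check_hostname is True
assert context.verify_mode == ssl.CERT_REQUIRED

# A real client would then wrap a TCP socket (hostname is illustrative):
#   with socket.create_connection(("example.org", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="example.org") as tls:
#           tls.sendall(b"...")
```

Note that this protects only the links the client negotiates; it says nothing about what the provider does with the plaintext on its own servers, which is why the next criterion exists.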
2. Encrypted so the provider can't read it?
This criterion requires that all user communications be end-to-end encrypted. This means the keys necessary to decrypt messages must be generated and stored at the endpoints (i.e. by users, not by servers). The keys should never leave the endpoints except with explicit user action, such as backing up a key or synchronizing keys between two devices. It is fine if users' public keys are exchanged using a centralized server.
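To illustrate what "keys generated and stored at the endpoints" means, here is a toy Diffie-Hellman key agreement in Python. The tiny group (p = 23) is completely insecure and for demonstration only; real systems use large standardized groups or elliptic curves. The point is the data flow: private keys never leave the device, and the server relays only public values:

```python
import secrets

P, G = 23, 5  # toy parameters, NOT secure -- illustration only

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # generated and kept on this device
    pub = pow(G, priv, P)                 # safe to hand to the server
    return priv, pub

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# A centralized server may hold the *public* keys (the criterion allows this).
directory = {"alice": alice_pub, "bob": bob_pub}

# Each endpoint derives the shared secret locally, combining its own
# private key with the other party's public key from the directory.
alice_secret = pow(directory["bob"], alice_priv, P)
bob_secret = pow(directory["alice"], bob_priv, P)
assert alice_secret == bob_secret  # same secret, never transmitted
```

The server in this sketch can deliver messages but never learns `alice_priv`, `bob_priv`, or the shared secret, which is exactly the property end-to-end encryption demands.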
3. Can you verify contacts' identities?
This criterion requires that a built-in method exist for users to verify the identity of the correspondents they are speaking with and the integrity of the channel, even if the service provider or other third parties are compromised. Two acceptable solutions are:
An interface for users to view the fingerprints (hashes) of their correspondents' public keys, as well as their own, which users can verify manually or out of band.
A key exchange protocol with a short-authentication-string comparison, such as the Socialist Millionaire's protocol.
Other solutions are possible, but any solution must verify a binding between users and the cryptographic channel which has been set up. For the scorecard, we are simply requiring that a mechanism is implemented and not evaluating the usability and security of that mechanism.
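As a rough illustration of the short-authentication-string idea, here is a hedged Python sketch: both endpoints hash a transcript of the key exchange and render a few bytes as words that the users read aloud to each other; an attacker who substituted keys in the middle would produce a different transcript and therefore different words. (The word list and transcript format are invented for this example; this is a SAS-style comparison, simplified — OTR's Socialist Millionaire protocol works differently, comparing secrets without revealing them.)

```python
import hashlib

# Illustrative 16-entry word list (not any real protocol's).
WORDS = ["alpha", "bravo", "charlie", "delta", "echo", "foxtrot",
         "golf", "hotel", "india", "juliet", "kilo", "lima",
         "mike", "november", "oscar", "papa"]

def sas(transcript: bytes, n_words: int = 4) -> str:
    """Derive a short authentication string from a key-exchange transcript."""
    digest = hashlib.sha256(transcript).digest()
    return " ".join(WORDS[b % len(WORDS)] for b in digest[:n_words])

honest = b"alice_pub|bob_pub"          # what both honest endpoints see
tampered = b"alice_pub|mallory_pub"    # what a MITM victim would see

# Both honest users compute the same words and compare them verbally.
assert sas(honest) == sas(honest)
# A tampered transcript yields a different string (compared at full length
# here; a real SAS keeps it short as a usability/security trade-off).
assert sas(honest, 32) != sas(tampered, 32)
```

The length of the string is the trade-off: four words are easy to read aloud but give an attacker a small chance of a lucky collision, which is why protocols bind the SAS to a fresh ephemeral exchange.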
4. Are past communications secure if your keys are stolen?
This criterion requires that the app provide forward secrecy: all communications must be encrypted with ephemeral keys which are routinely deleted (along with the random values used to derive them). It is imperative that these keys cannot be reconstructed after the fact by anybody, even given access to both parties' long-term private keys, ensuring that if users choose to delete their local copies of correspondence, it is permanently deleted. Note that this criterion builds on criterion 2, end-to-end encryption.
Note: For this phase of the campaign, we accept a hybrid forward-secrecy approach with forward secrecy on the transport layer (for example through TLS with a Diffie-Hellman cipher suite) and non-forward-secret end-to-end encryption, plus an explicit guarantee that ciphertexts are not logged by the provider. This is a compromise as it requires trusting the provider not to log ciphertexts, but we prefer this setup to having no forward secrecy at all.
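One common way to obtain forward secrecy is a one-way key "ratchet". The following is a minimal sketch, assuming a shared initial secret already established by some key agreement; it is a heavy simplification of designs like OTR's or Signal's, not any particular app's implementation:

```python
import hashlib
import secrets

def ratchet(chain_key: bytes):
    """Derive a per-message key, then advance the chain one-way."""
    msg_key = hashlib.sha256(chain_key + b"\x01").digest()     # encrypts one message
    next_chain = hashlib.sha256(chain_key + b"\x02").digest()  # replaces old state
    return msg_key, next_chain

# Initial secret would come from a (forward-secret) key agreement.
chain = secrets.token_bytes(32)

keys = []
for _ in range(3):
    k, chain = ratchet(chain)  # the previous chain value is overwritten
    keys.append(k)

# Each message gets a distinct key; because SHA-256 is one-way, stealing
# the *current* chain state does not let an attacker recover earlier
# message keys -- once deleted, past traffic stays unreadable.
assert len(set(keys)) == 3
```

Deleting `keys[i]` and the old chain values after use is what makes the property hold in practice; an implementation that logs ciphertexts and keeps old keys gets no benefit.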
5. Is the code open to independent review?
This criterion requires that sufficient source code has been published that a compatible implementation can be independently compiled. Although it is preferable, we do not require the code to be released under any specific free/open-source license. We only require that all code which could affect the communication and encryption performed by the client be available for review, in order to detect bugs, back doors, and structural problems. Note: when tools are provided by an operating system vendor, we only require code for the tool, not the entire OS. This is a compromise, but the task of securing OSes and OS updates is beyond the scope of this project.
6. Is the security design properly documented?
This criterion requires clear and detailed explanations of the cryptography used by the application. Preferably this should take the form of a white paper written for review by an audience of professional cryptographers. It must provide answers to questions such as the following:
A clear statement of the properties and protections the software aims to provide (implicitly, this tends to also provide a threat model, though it’s good to have an explicit threat model too). This should also include a clear statement of scenarios in which the protocol is not secure.
7. Has there been an independent security audit?
This criterion requires that an independent security review has been performed within the 12 months prior to evaluation. This review must cover both the design and the implementation of the app and must be performed by a named auditing party that is independent of the tool's main development team. Audits by an independent security team within a large organization are sufficient. Recognizing that unpublished audits can be valuable, we do not require that the results of the audit be made public, only that a named party is willing to verify that the audit took place.
We’ve discussed this criterion in depth in a Deeplinks post: What Makes a Good Security Audit?
Source – www.eff.org
The right to privacy and security is the most basic human right of all.