Friday, March 22, 2013

Security Implications from One Year on Mobile Only

Benjamin Robbins (@PaladorBenjamin) just completed 52 solid weeks working solely on mobile. Of course there were some issues, but he did it and the lessons learned are instructive.

A key takeaway:
    From a practical perspective I’ve learned that there are certain needs of human ergonomics that you just can’t engineer your way around no matter how cool the technology. I can say with confidence that a monitor and keyboard are not going anywhere anytime soon.
This is a key insight for people in mobile security. It's not mobile-only that we should be designing for; it's mobile-plus: mobile and something else, and on top of that any number of hybrid models like BYOD and COPE.

Your mobile device is an extension of other things; it's not a full replacement. So as someone designing security and identity services for mobile, you have to be able to mesh that identity with the server, the other machines, and the directory management systems.

It's tempting to think of machines and mobile devices as islands that we need to protect (enterprise archipelago security architect?), but this is to miss the point. The mobile device needs data input from other places (likely from people using keyboards ;-P), access to documents, and server-side communications. Users also want something resembling a consistent set of access rights no matter what platform they are using - laptop, webapp, mobile, workstation or tablet. These are unsolved problems in the security and identity industry today.

Still, practical issues aside, Benjamin Robbins' piece is a great testament to how far things have come in a short while for mobile. I continue to expect that we'll see more mobile apps, not fewer, and that the devices will snowball on top of the servers, browsers, services, and desktop/laptop machines you already have to cope with. Design your security services accordingly.


**
Three days of iOS and Android AppSec training with Gunnar Peterson and Ken van Wyk - Training dates NYC April 29-May 1

Tuesday, March 19, 2013

US FTC fires a warning shot in the mobile software security wars

If you weren't looking carefully, you probably weren't even aware of it. (Indeed, I hadn't seen it until I read John Edwards's piece over at The Mobility Hub.) But, make no mistake about it, this is a big deal for the software industry. The ramifications could be far reaching and could end up touching every company that develops software (at least for US consumers).

What's the big deal? HTC America recently settled a complaint filed against them by the Federal Trade Commission. The terms of the settlement force HTC to develop patches to fix numerous software vulnerabilities in its mobile products, including Android, Windows Mobile, and Windows Phone products.

Blah blah blah, yawn. Right? WRONG!

What makes this case interesting to software developers in the mobile and not-mobile (stationary?) worlds is the litany of issues claimed by the FTC. Among other things, FTC claims that HTC:

  • "engaged in a number of practices that, taken together, failed to employ reasonable and appropriate security in the design and customization of the software on its mobile devices";

  • "failed to implement an adequate program to assess the security of products it shipped to consumers;"

  • "failed to implement adequate privacy and security guidance or training for its engineering staff;"

  • "failed to conduct assessments, audits, reviews, or tests to identify potential security vulnerabilities in its mobile devices;"

  • "failed to follow well-known and commonly-accepted secure programming practices, including secure practices that were expressly described in the operating system’s guides for manufacturers and developers, which would have ensured that applications only had access to users’ information with their consent;"

  • "failed to implement a process for receiving and addressing security vulnerability reports from third-party researchers, academics or other members of the public, thereby delaying its opportunity to correct discovered vulnerabilities or respond to reported incidents."
Oh, is that all? No, it's not. The FTC complaint provides specific examples and their impacts. The examples include misuse of permissions, insecure communications, insecure app installation, and inclusion of "debug code". It goes on to claim that consumers were placed at risk by HTC's practices.

Now, I'm certainly no lawyer, but reading through this complaint and its settlement tells me that the US Federal Government is hugely interested in mobile product security -- and presumably other software as well. I don't know the specifics of just what HTC really did or didn't do, but this sure looks to me like a real precedent nonetheless. It should also send a firm warning message to all software developers. There but for the grace of God go I, right?

Reading the complaint, there are certainly some direct actions that the entire industry would be wise to heed, starting with implementing a security regimen that assesses the security of all software products shipped to consumers. Another key action is to implement privacy and security guidance or training for engineering staff. That list should go on to include assessments, audits, reviews, and testing products to identify (and remediate) security vulnerabilities.

There are many good sources of guidance available today regarding this sort of thing. Clearly, we believe mobile app developers could do a lot worse than attending one of our Mobile App Security Triathlon events like the one we're holding in New York during April. But that's just one of many good things to do. Be sure to also look at the Build Security In portal run by the US Department of Homeland Security. OWASP's Mobile Security Project can also be useful in looking for tips and guidance.

Come join us in New York and we'll help you build your mobile app security knowledge, as well as provide many pointers to other useful resources you can turn to so that your organization isn't so likely to find itself in the FTC's crosshairs.

Cheers,

Ken van Wyk

Schneier Says User Awareness: Tired, Dev Training: Wired

Bruce Schneier tackles security training in Dark Reading. He basically argues that classic "security awareness" training for users is a waste of money. There is certainly a lot of evidence to back up that claim; users routinely click through certificate warnings, for example.

What I found most interesting is what Bruce Schneier recommended to do instead of security awareness training for users:
we should be spending money on security training for developers. These are people who can be taught expertise in a fast-changing environment, and this is a situation where raising the average behavior increases the security of the overall system.

If we security engineers do our job right, users will get their awareness training informally and organically, from their colleagues and friends. People will learn the correct folk models of security, and be able to make decisions using them. Then maybe an organization can spend an hour a year reminding their employees what good security means at that organization, both on the computer and off. That makes a whole lot more sense.
Of course I wholeheartedly agree with this. Let's say doing a great job on security awareness training for users, best case, takes the rate of users clicking through cert warnings from 90% down to 80%.

On the other hand, developers, security people, and architects are actually building and running the system. If they know how to avoid mistakes, they are in a position to protect all of the app's users from a broad range of threats.

This is the essence of what Ken and I focus on in Mobile App Sec Triathlon training. I wrote about it in Why We Train. We want to help developers, security people and architects recognize security problems in design, development and operations; and, crucially, have some concrete ideas on what they can do about them.

Companies are scrambling to get "something" up and running for mobile, either enterprise-side or customer/external-facing or both. It really reminds me of the early days of the web. A lot of this is very fragmented inside of companies. A lot is outsourced, too. Ken and I put a lot of thought into the three-day class so that it's focused on what companies want and need.

Choose Your Own Adventure
Day one is about mobile threats that apply to all platforms, architecture, and design considerations. We look at threat modeling for mobile. We drill down on the identity issues for mobile, the server side, and what makes a Mobile DMZ. The class is set up so that architects and dev managers may choose to attend just day one.

Days two and three are hands-on iOS and Android, depending on what your company is building and/or outsourcing. You come out of these days knowing how to avoid security pitfalls in coding for mobile. Whether you are doing the dev in house or working with a provider, developers and security people will come away with a deeper understanding of the core security design and development options for building more secure code.

We recently announced a scholarship program for students and interns. Based on past trainings, this has proven to be a great way to get fresh perspectives on mobile trends. Finally, since many companies are launching new mobile projects, we often see whole teams that need to get up to speed on the issues rather quickly (before deployment), so to serve this need we offer a group discount: send three people and the fourth comes free.

Overall, our approach is geared towards adapting to the things that are most useful to companies trying to build more secure mobile apps. Training developers on secure coding is not yet a sine qua non, but for those that invest in building up skills and expertise, it pays dividends in protecting your users, data, and organization.





Monday, March 18, 2013

ANNOUNCING: MobAppSecTri Scholarship Program


For our upcoming three-day Mobile App Sec Triathlon in New York City on April 29 - May 1, we are once again running a student / intern scholarship program.

We will be giving away a small number of student / intern tickets to the event, absolutely free, to deserving students and interns.

Course details can be found here.

Requirements

To be considered for a student / intern free registration, you will need to submit to us by 31 March 2013 a short statement of: A) Your qualifications and experience in mobile app development and/or information security, and B) Why you deserve to be selected. Candidate submissions will be evaluated by the course instructors, Gunnar Peterson (@OneRaindrop) and me (@KRvW). Decisions will be based solely on the quality of the submissions, and all decisions will be final.

Details

All scholarship submissions are due no later than midnight Eastern Daylight Time (UTC -0400) on 31 March 2013. Submissions should be sent via email to us. Winning entrants will be notified no later than 15 April 2013.

Student / intern ticket includes entrance to all three days of the event, along with all course refreshments and catering. Note that these free tickets do not include travel or lodging expenses.

Wednesday, March 13, 2013

What can/should the mobile OS vendors do to help?

Mobile device producers are missing important areas where they can and should be doing more.

What makes me say this? Well, I was talking with a journalist about mobile device/app security recently when he asked me what the device/OS vendors can do to help with security for end consumers. Good question, and I certainly had a few suggestions to toss in. But it got me thinking about what they can be doing to make things better for consumers. And that got me thinking about what they can be doing to help app developers.

On the consumer side, the sorts of things that would be on my wish list include:

  • Strong passcode authentication. On iOS, the default passcode is a 4-digit PIN, and many people disable passcodes entirely. Since the built-in file protection encryption key is derived from a combination of the hardware identifier and the user's passcode, this just fails and fails. Even a "protected" file can be broken in just a few minutes using readily available software that brute-force guesses all 10,000 (count 'em) possible passcodes. Well, a stronger passcode mechanism that is still acceptable to end consumers would be a good start. There are rumors of future iOS devices using fingerprint scanners, for example. While biometric sensors aren't without their own problems, they should prove to be a whole lot better than 4-digit PINs.
  • Trusted module. Still picking on iOS here... Storing the encryption keys in plaintext on the SSD (NAND) violates just about every rule of safe crypto. Those keys should be stored in hardware, in a place that's impossible to reach programmatically and prohibitively costly to extract forensically.
  • Certificates. Whether they are aware of it or not, iOS users use certificates for various trust services on iCloud and elsewhere, like Apple's Messages app. Since Apple is already generating user certificates, why not also give all iOS users certificates for S/MIME and other security services? That would also open up to app developers the possibility of stronger authentication using client-side certificates.
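To put that first bullet in perspective, the brute-force arithmetic is easy to sketch. This is a back-of-the-envelope illustration only; the guess rate is an assumed figure, since real on-device attack rates vary with hardware:

```python
# Compare the keyspace of a 4-digit PIN to a 6-character alphanumeric passcode.
def keyspace(length: int, alphabet_size: int) -> int:
    """Number of possible passcodes of the given length and alphabet."""
    return alphabet_size ** length

pin_space = keyspace(4, 10)      # 10,000 possible 4-digit PINs
alnum_space = keyspace(6, 62)    # ~5.7e10 for [a-zA-Z0-9]

# Assume roughly 12 guesses/second for an on-device brute force (hypothetical rate).
rate = 12.0
print(pin_space / rate / 60)                   # minutes to exhaust every PIN (~14)
print(alnum_space / rate / (3600 * 24 * 365))  # years for 6-char alphanumeric (~150)
```

Even a modest alphanumeric passcode moves the worst case from minutes to centuries, which is why the default 4-digit PIN is the weak link.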

Here are a few of the things I think would be useful to mobile app developers, in no particular order:

  • Authenticator client for various protocols. There are various ways to build an authenticator into a mobile app. In their various SDKs, it would be useful for device vendors to provide example authenticator code for popular protocols and services such as Facebook Connect and Google Authenticator.
  • Payment services. Similarly, example code for connecting to PayPal and other payment services back-ends would be useful. We're seeing some of those coming from the payment providers themselves, which is great, but it's been a long time coming.
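For a flavor of what such an SDK sample could look like, here is a minimal sketch of the TOTP algorithm (RFC 6238, built on RFC 4226 HOTP) that Google Authenticator implements, using only the Python standard library. This is an illustration of the algorithm, not production authentication code:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based OTP: HOTP over the current 30-second window."""
    return hotp(secret, int(time.time()) // step, digits)
```

With the RFC 4226 test key b"12345678901234567890", hotp(..., 1) yields "287082", matching the published test vectors, so a sketch like this can be verified directly against the spec.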
So, I have no inside knowledge at Apple, or Google for that matter, but it's always nice to dream. A few relatively small enhancements to the underlying technology could open up all sorts of possibilities for users and developers alike. As it stands, an app developer writing a business app on iOS has to build so many things from scratch, as the intrinsic options for safe data storage, transmission, etc., are just not acceptable for today's business needs.

How about you? What would you add or change on these lists? What are your pet peeves or wish list items? We'd love to hear them.

Come join Gunnar (@OneRaindrop) and me (@KRvW) for three days of discussing these and many other issues in New York at our next Mobile App Sec Triathlon, #MobAppSecTri.

Cheers,

Ken


What Comprises a Mobile DMZ?

I have a new post on the Intel blog on Mobile DMZs. The post looks at which parts of Identity and Access Management, Defensive Services, and Enablement stay the same for mobile, and which parts must adapt.

**
Three days of iOS and Android AppSec geekery with Gunnar Peterson and Ken van Wyk - Training dates NYC April 29-May 1