Blog

  • PHP Insecurity: File handling and remote code execution

    For better or worse, there are a lot of novice programmers hammering away at PHP scripts all over the planet. It is one of the most common web scripting languages. However, it’s simply too hard for a newbie PHP programmer to write secure PHP code. As I’ll demonstrate, it’s also impossible for even security-minded PHP professionals to keep their applications secure, due to the way PHP manages change to its ever-growing API. The PHP project’s culture of “add stuff, but stuff the security implications” has to stop. Don’t get me wrong, I love change. I just don’t love the way the PHP project goes about it.

    Let’s take a non-hypothetical instance. Some functions are very familiar to Unix folks, like fopen(), fread(), fclose() and so on. In Unix, the semantics of these functions and the security issues surrounding them are well understood. However, in PHP, fopen() and friends are heavily overloaded, and gain new functionality even between minor PHP releases. For example, by default, PHP’s fopen() and several of its friends can open any file on the Internet. Producing a safe, canonical filename is basically impossible in PHP.

    Take a typical PHP application with templated language files. A typical implementation will enumerate a directory to see what files are available (English.lang.php, русски.lang.php, etc) and then try to “fix up” the user’s selection. The attacker will then try to substitute ../../../../../etc/passwd or something similar. Nothing new here for our Unix friends. But what about going offsite? Well, the top vulnerability for PHP applications in 2005 is remote file inclusion, and it uses this exact same mechanism.

    The usual type of thing I see all the time:

    $language = $_POST['language'] . ".lang.php";
    include($language);

    Of course, the security people reading this are going “nononononono!”. But why should the average PHP programmer know any better? PHP has made a basic operation very hard to get right. This is not to say J2EE or ASP.NET are invulnerable to this type of boneheaded programming, but they don’t let you include files from over the Internet and then evaluate their contents.
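
    Something along these lines keeps the decision on the server side instead of trusting the browser. This is only a sketch – the ./lang/ directory layout and the English default are my assumptions, not a drop-in fix:

    // Never pass user input to include(); pick from files we enumerated ourselves.
    $available = array();
    foreach (glob('./lang/*.lang.php') as $path) {
        // basename() with a suffix strips both the directory and the extension
        $available[] = basename($path, '.lang.php');
    }

    $requested = isset($_POST['language']) ? $_POST['language'] : 'English';

    // Include the file only if the submitted value exactly matches a known language
    if (in_array($requested, $available, true)) {
        include './lang/' . $requested . '.lang.php';
    } else {
        include './lang/English.lang.php';
    }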

    What if we move to file_get_contents() instead of including the result? file_get_contents() is rarely used as it is a PHP 4.3.0 and later construct, and PHP coders are reluctant to use new-fangled calls when old ones will do. However, it is no better! It STILL allows us to read the file directly from a URL or via a wrapper, like php://output (which acts like echo… with the usual untrusted data issues… XSS city), or php://filter/resource=http://www.example.com … and this is NOT restricted by allow_url_fopen. Who comes up with these settings?
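
    If you do reach for file_get_contents() for templates, about the only defence I can offer is to vet the name yourself before the call ever sees it. A rough sketch (the ./templates directory and the helper name are my own, hypothetical): refuse anything that looks like a URL or stream wrapper, then make sure the resolved path stays inside the directory you expect.

    function read_local_template($name)
    {
        $base = realpath('./templates');
        if ($base === false) {
            return false;
        }

        // Reject wrapper syntax outright: http://, ftp://, php:// and friends
        if (strpos($name, '://') !== false) {
            return false;
        }

        // realpath() fails on missing files and resolves any ../ tricks;
        // the prefix check ensures we never escape the template directory.
        $path = realpath($base . '/' . $name);
        if ($path === false || strncmp($path, $base . '/', strlen($base) + 1) !== 0) {
            return false;
        }

        return file_get_contents($path);
    }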

    Programmers are usually surprised by how many of what used to be local file operations can now reach remote files and stream filters. The job is made harder because PHP keeps changing its mind about what is available. What used to be a safe application on PHP 4.2.x is no longer safe on PHP 4.3.x or PHP 5 – just because PHP changed.

    Add extremely fragmented documentation (i.e. “see Appendix L”, or read the usually extensive user comments to see how the functions ACTUALLY work), and it takes real experience to program PHP’s file operations safely. With a very low barrier to entry, PHP needs to keep these advanced features away from those who don’t know what they’re doing. However, it’s far too late. PHP is used by programmers of many different skill levels. The average Joe programmer has no hope in hell of writing a safe PHP application.

    In the meantime, let me plug Chris Shiflett’s brand spanking new PHP Security book from O’Reilly:
    Amazon Listing

    If you want to write secure apps in PHP, you need that book.

  • On the failure of genre

    On the weekend, I bought two CDs:

    • Goldfrapp – Supernature
    • Ministry of Sound – Sessions

    I couldn’t find the Goldfrapp. At all. Despite the album being heavily promoted in store, with a large 10×10 m poster dominating the front window, there was none on a display stand, and it wasn’t in any of the genre sections I would have expected, like “Popular” or “Electronic” or “World”.

    The morons had put it into the “alternative” section. The CD is about as alternative as Pink Floyd’s The Wall, which is (correctly) in the popular section.

    Genre-ification of music does not serve customers. I’m sure it makes the CRM-enabled music execs wet their pants, but it leaves me cold – and I buy a *bucketload* of music every year.

  • “Enterprise” levels of insecurity

    Why is it that “enterprise” applications have the worst security?

    If VXers researched this area, they could bring corporates all over the world to their knees.

    Typical mistakes include:

    • clear text management protocols
    • clear text authentication, if authentication is performed at all
    • excessive privileges required for their tasks
    • poor coding and testing – it’s usually trivial to make agents segfault or GPF with simple fuzz testing tools
    • insecure default configurations out of the box
    • default usernames and passwords
    • dependencies on old software stacks which themselves have security issues
    • secretive and obtuse documentation, particularly around security issues
    • stupid limitations… like BMC Patrol’s requirement that all agents run at a matching security level … or else the console does not work. This makes for Big Bang changes in most environments, which means no change.

    I could go on, but my blood is boiling. If you are buying management software, buy *secure* management software. Don’t trust the vendor to tell you about this – evaluate the software in your environment. Use Ethereal and ettercap to detect whether it’s sending clear text or replayable secrets over the wire. Use the trial software against a default installation and see if you can manage your test hosts with default passwords.

    Unbelievable.

  • On DDoS protections

    A number of gambling websites recently suffered massive distributed denial of service (DDoS) attacks. These are not the first, and certainly not the last, massive DDoS attacks. In fact, the problem will only get worse as consumers place ever more unsecured equipment on the Internet via fast (enough) broadband links.

    The attacks

    The attacks are assumed to be of Eastern European origin, due to the mail servers the attackers choose to use and their broken English. However, even this cannot be guaranteed. What is clear is that they do not write the attack tools; they just use them as part of their extortion racket.

    These attacks can be characterized into three major categories:

    • spoofed UDP traffic
    • spoofed SYN traffic
    • HTTP data sinks

    The first two are utterly preventable and should never have been able to traverse the Internet. The originating ISPs (wherever they are) are ultimately responsible for their customers’ traffic. My view is simple: if an ISP allows spoofed packets out as part of a DDoS network, a proportion of the losses should be recoverable from these criminally negligent ISPs.

    Countermeasures: network equipment manufacturers

    About the only welcome aspect of a shakeout in network equipment manufacturers is that it becomes simpler to address one of the core sources of the DDoS problem: ISPs letting spoofed traffic through their networks. If all digital modem concentrators, DSLAMs and HFC headends had default and unstoppable anti-spoofing, high rate spoofed DDoS attacks would be impossible.

    How do we categorize ISPs into those who do the right thing (block customer traffic with spoofed or non-routable source addresses) and those who don’t?

    My view is that a scheme should be established with a dynamic list of the BGP ASes of ISPs who have been audited as “source IP address correct”. After a certain cut-off date, any ISP who refuses to be audited, or does not return an audit status, suffers the consequence of the rest of the Internet not passing their traffic.

    This would, at a stroke, eliminate spoofed IP traffic as a source of DDoS packets. Even the largest botnets contain only a few thousand hosts, and with real source IP addresses exposed, technical schemes to rate limit IP addresses, dynamically deny IP addresses, and other anti-DoS techniques could be implemented.
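
    As a toy illustration of what becomes possible once sources are real (the thresholds, file path and 503 response below are all invented, and a real deployment would do this at the network edge rather than in application code), even a crude per-source-IP throttle starts to mean something:

    // Count requests per source IP in a fixed window and refuse the excess.
    $ip     = $_SERVER['REMOTE_ADDR'];
    $window = 60;    // seconds
    $limit  = 120;   // maximum requests per window per source IP
    $file   = '/tmp/ratelimit_' . md5($ip);

    $record = @unserialize(@file_get_contents($file));
    if (!is_array($record) || time() - $record['start'] > $window) {
        $record = array('start' => time(), 'count' => 0);
    }
    $record['count']++;
    file_put_contents($file, serialize($record));

    if ($record['count'] > $limit) {
        header('HTTP/1.0 503 Service Unavailable');
        exit;
    }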

    The next portion of the DDoS answer is how to deal with infected hosts. My personal view is that responsibility is shared among the owner of the infected equipment, the ISP, and the vendor of the application or operating system which allowed the infection to take place.

    For example,

    • the customer should have a personal firewall on by default and run modern anti-virus software with recent updates;
    • the ISP should be aware of customers who have unusual traffic patterns and can easily recognize Trojan ports or activities;
    • vendors of software such as (but not limited to) mIRC, Kazaa, or Windows contribute to the total number of DDoS bots in existence by not putting in controls which prevent the distribution of malware or limit its activity;
    • by default, any operating system which provides direct access to the network card, or allows the creation of arbitrary packets, should present an undefeatable warning which the end user must accept before the application can communicate with the outside world.

    Although it can be successfully argued that this would only raise the bar in terms of how clever the malicious bots would need to become under such a regime, the reality is that few possess the skills to create new malware. There are a limited number of heavily inter-related attack bots in existence today. By raising the bar, the bots would be:

    • identifiable by source IP address
    • limited in their spread
    • limited in their opportunity to run successfully

  • Security architecture and document reviews

    I work in an environment where there is “implicit” review. That is, if you don’t respond in time during the comments period, you are assumed to have signed off on the document. This is very dangerous for someone like me – I have tenuous links to the client organization, and although the contracting organization I work through has professional indemnity insurance to cover me, it only covers me as long as I’m not negligent – for example, by not reviewing a document and letting it through.

    Now this is where it gets tricky – I feel I have to review any document with my name in it. However, if I review a document, I’m not about to let a sub-standard document through. I will review it and offer advice not just as the subject matter expert, but also on ways to improve the document and make it better. Oftentimes these suggestions are accepted and all are happy.

    However, today I reviewed a truly sub-standard solution architecture document (known as a SAD). A SAD normally takes the business requirements and maps out, in fairly specific non-techo language, how a system will work and how it interacts with other systems. The developers take the SAD and turn it into an implementation after writing a detailed design (DD). The DD will basically agree with the SAD’s approach, but will add the minor details and the extreme technical detail.

    For example, a well written SAD will have something like this for every feature which is required to be implemented:

    User Login Sequence

    Scenarios

    • User wants to log into the application.
    • Attacker may want to brute force the application

    Sequence

    • User is registered using process defined in 3.2.1.
    • User navigates to web site and is presented with a login form (defined in 5.4.3)
    • User fills in username and password
    • The system takes the username and password and checks whether the password is correct. The credentials are stored in the user registry, following the security principles outlined in section 1.2.3 (i.e. no reversible passwords).
    • If the credential is correct, move to the foo page (see section 4.3.2)
    • If the password has expired, move to the password reset page (see section 1.4.2)
    • If the password is wrong, display a generic error message (as per message defined by the business – see Appendix A)

    You get the picture.
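
    For contrast, the DD-level cut of that login check eventually boils down to something like the sketch below. It is purely hypothetical – the table layout, the crypt() storage format and the function name are my inventions, not the project’s – but it shows where the “no reversible passwords” principle from 1.2.3 ends up:

    // Check a login attempt against a one-way password hash stored in the user registry.
    function check_login($db, $username, $password)
    {
        $stmt = $db->prepare('SELECT password_hash, expired FROM users WHERE username = ?');
        $stmt->execute(array($username));
        $row = $stmt->fetch();

        if ($row === false) {
            return 'invalid';    // generic failure - never reveal which part was wrong
        }

        // Compare against the stored one-way hash; the clear text is never stored
        if (crypt($password, $row['password_hash']) !== $row['password_hash']) {
            return 'invalid';
        }

        return $row['expired'] ? 'expired' : 'ok';
    }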

    Unfortunately, the SAD I reviewed today was so woolly that I think it described a system which had between three and five new screens. I’m not sure, you see. So I gave that feedback, along with another 60 mistakes I found. This didn’t go down too well with the author. I had forgotten the prime directive of security consultants – couch your criticism nicely rather than being brutally honest.

    How to write a report

    However, looking back at it, there is another issue – the author is not qualified to write SADs. No one is – there’s no definitive form that anyone can agree on.

    I feel that to ensure high quality software and secure solutions, there needs to be a One True Way of producing solution architectures. Five years ago, I’d have been pushing the line that there needs to be a security architecture as well as a SAD, but no – what we need is solid solution architecture. Security is a key attribute, this is not in doubt, but everyone is responsible for security, not just the security slaves.

    That’s why I am writing a Solution Architecture Book rather than a Security Architecture Book.

  • Sick as a dog and procrastination

    I don’t know why, but after being ill for a few days, I can’t get back into the swing of things.

    I went to work today for the first time since Tuesday last week, and that seemed to have helped. But I also P’d some time against the wall importing all the old Movable Type blog entries here. MT was pissing me off – so much link spam. I’ve never had any here, so I moved all the posts from the two blogs across. It’s interesting to re-read some of my old posts.

    The book for A-W is definitely needed. Security Architecture is just so undefined – it’s a wide open field.

    OWASP Guide is also lagging – need to kick off the 2.0.2 edits and get that out the door. I’ve been playing around with InDesign to see if I can make a better looking export. The 2.0.1 print from Word is primitive.

  • DefCon Wrap Up

    Well, I’m back. Alien Andrew has departed, and it’s a nice cool 14 C again.

    Rolling back to Sunday afternoon…

    After posting my crop dusting blog entry, Mike P rolled up and we went off to the cafeteria to deconstruct the last few days. It was good to catch up before I left.

    At around 12.30, Chris and Jen from newbeetle.org rang, and we organized to go to the Hofbrauhaus a few doors away from DefCon. Well, was that a hoot! Lots of singing and being (very) merry, courtesy of copious quantities of good-quality German beer and a decent meal, and I was ready to be poured onto the plane. Chris and Jen are the best! They even dropped me off at the airport.

    It was a good thing I was a bit sozzled – the TSA screening process is awful in its mediocrity and not ameliorated in any way by the absurd queue lengths. They didn’t ask to see my laptop working, they just wanted to X-ray it. They didn’t hand search my luggage to determine if I had any ninja tools or anything like that. They just processed us as if it meant something. TSA searches are a complete waste of time and are completely ineffective against even a half-arsed adversary.

    I was waiting in the public lounge area for my flight when spontaneous applause from the public erupted when about 20 soldiers from Iraq returned on a flight. It’s good that even though the war is illegal and the actions of the US leadership dubious, the public still support their armed forces. I could see the smiles on the faces of the men and women returning, and I could see they appreciated the public’s support.

    The five-hour stopover in LAX was ordinary, with only one saving grace – I didn’t have to be re-screened.

    The flight home was long and terrible – United, as per normal, put their most elderly planes on the LAX–Sydney route, as only Qantas competes with them. Qantas also use retirement-villa planes for this leg. Very uncomfortable thin seats, with no entertainment system in the seat backs to play with. My seat wouldn’t recline as far as the seat in front of me did, so I was squished most of the time. Plus, the seat pitch was tight – about the same as Virgin Blue’s domestic flights. I will not be travelling United again.

    Once we hit Sydney, we were screened… twice. Once when we left the plane to go back into the sterile duty-free area near the gates, and again when I re-entered the gate area. This was the only time my baggage was hand searched. I don’t know what they expected to find after we’d been screened several times already. Oh well, screening passes the time.

    I eventually hit home around 1 pm after again travelling for 32 hours. I missed Monday altogether. When the A380s come out, I will fly whoever travels point to point: Melbourne–London non-stop, bring it on! I hate being screened constantly and pointlessly.

  • DefCon Day 2 and a bit of Day 3

    Well, alien Andrew is in full control. I was up at 4.30 am on Saturday morning, and again this morning. Unfortunately, on Saturday I crashed at about 8.30 am and slept until nearly 2 pm.

    I went to DefCon at 3 pm to meet with Brenda from the Trike team. We had a good discussion on the approach to Trike for larger apps and how Trike might be suitable for inclusion into OWASP materials.

    At 4 pm, I met with Meredith and Robert from the U. Iowa Dejection project. I went back to Meredith’s hotel room and we had a very long and interesting discussion on the future of their project and many other topics. I bounced a few ideas off them. All in all, this was the best and most useful part of DefCon for me – the networking with clueful people.

    DefCon has been very disappointing. What the hell are folks working on? DefCon 13 has nothing really new or exciting. All the attacks I’ve seen (or reviewed in the PPTs on the CD-ROM) are derivative and boring. There’s so much web app sec stuff occurring, and yet DefCon schedules not a single session on it while running four back-to-back sessions on the physical security of lame devices like door locks and safes. If we made encraption as bad as the average door lock or cabinet, we’d be laughed out of the house. I can’t believe that people think this stuff is bleeding edge.

    So today, I’m going to meet up with some folks from newbeetle.org – Jen and Chris, rather than sweat it out in any of the remaining sessions. The CTF competition is winding down behind me as I write, and the drum and bass sound track is getting thin. The geek next to me is listening to his own stuff rather than be pummelled by the incessant driving bass. Defcon needs good DJs who are aware of the many styles other than D&B. I love techno music, and this stuff is repetitive crap.

    The other thing that really, really pisses me off is the smoking. WTF were the organizers thinking? Maybe 5% of the attendees smoke, and they are making this room unbearable despite the aircon working overtime. I’m going on a serious crop dusting mission soon. I love dropping my guts right around an active smoker. The dead-thing emissions from my nether regions are nothing compared to the foul, deadly particulate stream that I now stink of, thanks very much.

    The worst part is that outside is literally just 5 – 10 metres away. They could kill themselves out there and not harm me and the other 95% of the participants.

    If I come again, I’ll find a reliable source of onion rings so my crop dusting missions will be all the more fruitful. I have to spread the love.

    I have to be at the airport by 5.45 to make it back to LAX in time for my late-night, 16-and-a-bit-hour flight to Australia. I miss out on Monday altogether. Oh well. I had two Tuesdays this week.

  • Day One of Defcon

    I woke up pretty early. Obviously, as I am not alien Andrew, it must be that the jet lag has not caught up with me yet. I took a long, slow breakfast and decided to let everyone rush to the first sessions of DefCon without me. I prepared Mr Laptop for a day at DefCon. I turned on the firewall logs and turned off inbound UDP and Bluetooth. I logged out of all my favorite websites which have a clear text equivalent hash (such as Slashdot or even this blog). I confirmed that I was using SSL to grab my e-mail. I fear that it’s not going to be enough.

    When I tried getting a cab, there was a queue a mile long. So I waited. It was 40 C at 9.30 am. Lovely.

    I finally got there around 10 am when the first sessions were kicking off, and I realized that effectively, if you’re not in the queue 15+ minutes before a session starts, you’ve got Buckley’s of seeing that session. DefCon has become too popular; only about half the attendees can see any particular session. In fact, this means you can only see about half the sessions if you stay to the end of each one. So pretty much everyone leaves 10 minutes before the end, which is very disruptive to the presenter.

    I wanted to meet up with a bunch of people, so I gingerly turned on Mr Laptop and SMS’d a bunch of folks. I watched the Sheep board to make sure I wasn’t exposing any passwords. Luckily, the answer was no. The Sheep board is a bit disturbing: they play a short film clip every time someone exposes a password in the clear, Bluetooth included (boy, am I glad I turned it all off before I arrived). The film clips vary from a bunny being cleaned up by a Mercedes in a race to a woman being mauled by a bear when she sits next to it, and others. Each clip is only a second or so long, but you wish you hadn’t looked.

    Sure enough, DefCon is filled to the brim with freaks. There’s the guy in a large woollen trenchcoat with a laptop in a harness on his back and two very large aerials. No guesses as to what he is doing. There are the various goths and so on who are there to be seen in their resplendent black battle gowns. It’s now 42 C. There’s the guy wearing a black skirt, trying desperately to pass it off as a kilt. No sir, it’s a skirt. Here’s what he looks like – in fact, I think I might have found his source:

    Skirts for freaks

    Then there’s the usual sort of script kiddie trying to attack other computers. The firewall logs I have are completely lame.

    One of the cool things about this DefCon compared to last time is that there seem to be more chicks. There’s like five of them. Unfortunately, four are goths, but the other one is pretty cute. I will keep you posted on this disturbing trend!

    I went to a few sessions, but they were all about network attack tools. The state of the art here really hasn’t moved along as much as I would have thought. I will be selecting day two’s talks a bit more carefully. Which could be tricky, as there seems to be a hardware lock-picking stream, leaving just two other streams.

    One talk I went to by accident was the NRMC’s. The tent was running late, and the guy they had lined up to present on the next generation of cryptanalytic hardware thought he was presenting on Saturday. I stayed anyway, as Simple Nomad was not presenting, which is good as Simple Nomad is a wanker of the highest order. After presenting a few tools and using only half an hour of their allotted hour, they started their Q&A spanking session. You could ask a question, but only if you had previously filled out their “ass release form”, which allowed them to spank you with one of four things:

    • “firm hand from any NRMC member of your choice”. They’re all blokes and only one looks like he could actually hurt you. The rest are weedy geeks
    • Paddle from a previous conference. Weedy geeks, shouldn’t hurt.
    • Copy of “Hacking Exposed, 3rd edition”
    • piece of paper with the words “Patriot Act” written on it. Gee, how funny is that?

    I didn’t hang around as they seemed to want to make love to Mudge on stage and I felt that Mudge wanted man meat sooo bad. Normally, I’m totally into that, but I’d already seen the presentation and I had a meeting to get to. The tools weren’t too bad, but the tools on stage had a highly inflated opinion of themselves.

    I met our new publisher, Bill Pollock from No Starch Press. He’s a really nice guy and he’s so totally on the same page as us. He took me to a Vietnamese chain restaurant (!), and I had some grilled sliced pork. I think we’ll get along just fine.

    After that, I was invited to a party at Hard Rock (again), but honestly, I just didn’t feel like being smoked upon, so I tried to go back to my hotel. The taxi line was more than a mile long, with hundreds waiting. A cab came every 30 seconds or so. I looked in the distance and I could see Caesar’s Palace. The cab ride in the morning wasn’t too bad… So I walked it back.

    Well, it took about an hour and was filled with bizarre stuff. These dodgy dudes at a set of lights wondered if I would buy their shitty old van, a bargain at $300. Yeah, riigggght. I kept on walking and nearly got cleaned up when I accidentally stepped onto the road without looking in the right direction. Luckily, the lanes are wide. Then near the main drag, I was being followed by a goth, and he was being followed by an extra from Mullet World. I looked with interest at the ticket window of Fashionistas, and waited for them to pass. And what’s not to like about looking at a hoarding of five beautiful women in fake wigs, knee-high boots, riding crops and fishnets?

    I stopped off at the Bellagio’s fountains about 45 minutes into my trek. They started playing “Song for Guy” by Elton John, and the 600-metre-long fountains in the middle of this man-made lake were doing this water ballet routine. It was so Las Vegas – one of the best gay love songs ever written playing through speakers hidden inside fake 19th century gas lamps spotted every few metres, a beautiful water-fountain routine obviously synchronized by a friend of Dorothy or two, all the while huge police and fire trucks were zooming down the Strip drowning out the music with their sirens.

    I finally got back to the hotel room. I was bushed, hot and sore. Only one thing for it – I filled the spa. Any thought of going out to get smoky was eliminated when those gentle bubbles hit my sore feet.

  • Day Two of BlackHat

    The day kicked off fairly well, albeit tinged with disappointment and anger at Cisco for being Butt Heads.

    I read about Michael Lynn’s legal problems with Cisco in more detail. I tried finding him to offer my support, but unfortunately it’s a big conference and I bet he was lying low. I for one will be making sure that Cisco products are off my buy list for a long, long time. If they really think that squishing security researchers is the answer, then they do not deserve business. Fix your damn bugs, morons.

    I tried interesting the press in talking to me about web application security issues, but unfortunately, they seem to be curiously uninterested. I think next time I’ll ask Black Hat (or whoever) to organize a press conference, as honestly, they are missing the major story.

    Robert J. Hansen and Meredith L. Patterson’s talk on Dejection, a mathematical model to detect dynamic injection patterns, was an eye-opener. This was the best talk for me so far this conference. I later had lunch with Robert, and we’re meeting with both of them again on Saturday to go through how their work might be referenced in OWASP materials. They are seeking patents and working with a VC, so it might be tricky to go forward without causing either themselves or OWASP issues.

    I bought a copy of Michael Howard, David LeBlanc, and John Viega’s new 19 Sins book during the first break.

    I went to Phil Zimmermann’s talk on encrypted VoIP. He uses Macs and talked off the cuff, but despite that, this was the least technical talk I went to at BlackHat. In some ways, I should have gone to another session, as I didn’t learn that much. I read the new book rather than tune into the VoIP demo.

    About five others and I went to Tzi-cker Chiueh’s excellent, if very technical, talk on using the x86’s segment registers to provide hardware array-bounds overflow protection. He was very thorough, but unfortunately did not demonstrate the approach live. This is the sort of stuff that BlackHat should be concentrating on to some degree – preventing attacks using novel approaches. Unfortunately, too many people want to see the latest exploits.

    After lunch, I decided to try out the turbo talks. I went to Mike Pomraning’s talk on “not validating”, which actually was about validating. 🙂 I had a good long talk with Mike the night before, so I felt I should at least see his talk and heckle a bit 🙂 I sat with Robert and Meredith and we sort of heckled.

    I skipped a few of the next sessions, as I didn’t really think they’d be that interesting (and more to the point, the CD-ROM materials had good copies of the presentations), and went to read more of 19 Sins instead, after which I wrote up a preliminary review.

    The last session I attended was the Jericho Forum’s challenge; the Jericho Forum is a group promoting de-perimeterized architecture. They were handing out prizes for the best papers received which furthered their aims. None of the winners were there, but the papers themselves are fascinating. I’d really suggest you go get them:

    All of them are here:

    The winners:

    1. Balancing the equation
    2. Reference Architecture, Galwas
    3. Blind public key

    I am retaining my reputation for piking on social events. For some reason, I just didn’t feel like being smoked upon at the nightclubs where most of the après parties were being held. In fact, I don’t know why they still allow smoking in the foyers of the conference during the day. Only a few took advantage of it – most in IT aren’t stupid enough to smoke – but it was enough to make me feel queasy. I complained to the organizers at the end when I handed in my card. I’ll follow it up with an e-mail.