Friday, August 22, 2014

No, Wired - The Internet is Actually Pretty Safe

Wired ran this article today:

The Internet Is Way Too Fragile and Insecure. Let's Build a New One


Featuring this:

You may have had the bad luck of being stuck on a runway when a router failure in Utah grounded commercial flights around the country for several hours. Or maybe you were frustrated by not being able to access government websites the day the .gov domain administration had a glitch in its system. These minor mishaps over the past decade are early rumblings of an uncomfortable truth: The Internet is more fragile than it appears.

The problems with the .gov websites and the FAA were caused by accidents, but such accidents can have widespread effects. In 2008, censorship efforts by the government of Pakistan unintentionally caused YouTube to become inaccessible throughout the world. In another incident in 2010, much of the Internet was rerouted through China for a few hours, including traffic between US military sites. China Telecom plausibly claimed this was also an accident, but scenarios like this could be easily arranged.

Well, two main problems here:

1. As the article admits, those were human errors. A secure internet is never going to fix PEBKAC.
2. You may remember that there were recently reports of a Russian gang stealing over a billion passwords. Bruce Schneier, world-renowned security expert, had this to say in his recent Crypto-Gram:

I don't know how much of this story is true, but what I was saying to reporters over the past two days is that it's evidence of how secure the Internet actually is. We're not seeing massive fraud or theft. We're not seeing massive account hijacking. A gang of Russian hackers has 1.2 billion passwords -- they've probably had most of them for a year or more -- and everything is still working normally. This sort of thing is pretty much universally true. You probably have a credit card in your wallet right now whose number has been stolen. There are zero-day vulnerabilities being discovered right now that can be used to hack your computer. Security is terrible everywhere, and it's all okay. This is a weird paradox that we're used to by now.

On this count, I am going to side with Schneier.


9th Circuit Takes Closer Look at Arbitration Clauses in Browsewrap Agreements

The 9th Circuit handed down this decision the other day. For those who follow such things, the 9th Circuit covers all of California, which makes this ruling extremely important for the entire tech industry.

Let's summarize why it is important:

1. Browsewrap and clickwrap contracts have traditionally been upheld as valid by the courts - this means that when you use a site whose terms lurk behind a hyperlink (browsewrap), or click "I Agree" when signing into a website or installing a piece of software (clickwrap), you are, in fact, agreeing to the dozens of pages of legalese you absolutely have not read.

2. Recently, big companies have been inserting a variety of very troubling, anti-consumer clauses into such contracts, including mandatory arbitration clauses and waivers of the right to join class action suits.

(2) has been very troubling because the Supreme Court recently upheld the notion that by entering a shrinkwrap or browsewrap contract, you can agree to waive your right to participate in a class action suit and instead have the dispute move to arbitration. This is bad for consumers because corporations are the repeat customers of arbitration bodies, which gives arbitrators a very strong incentive to side with corporations over consumers in order to get repeat business.

What is interesting in the above-linked case, however, is that the Court threw out the arbitration clause itself - as opposed to the clause waiving the right to participate in a class action suit in favor of arbitration - with reasoning that boils down to "Seriously... who reads those things!?" In other words, the Court said that a browsewrap contract that doesn't bring the mandatory arbitration clause to the forefront gives insufficient notice to the consumer - a very interesting ruling.

SCOTUS is the next stop on this particular train, as it has the ability to undo a troubling history of browsewrap overreach. It is yet to be seen how SCOTUS will rule - given the very pro-corporate history of its browsewrap and shrinkwrap decisions, I'm not holding my breath - but this decision has the potential to wind the clock back a few steps in favor of the average web consumer. It may also wind up, however, that the next time you buy something on B&N you agree once to a clickwrap contract - and then hit "I Accept" a second time specifically when agreeing to arbitration language. Only time will tell.

Friday, May 30, 2014

Commercializing Open Source Licenses

Nearly a year ago on this blog, I had a post up arguing that Richard Stallman's position - that strong copyleft licenses are necessary to protect the open source movement - was misguided. I'm following up on that post now by explaining that strong copyleft licenses are not only inappropriate for certain business cases but are, in others, a powerful tool for monetizing commercial software - whereas Stallman seems to want a world where copyleft licenses exist only to promote the open source movement as a whole. Let's review:

A "strong" copyleft license is a software license that requires all distributed derivative works of that software to be licensed under the same terms as the original license, which typically includes distribution of source code. E.g. GPL.

A "weak" copyleft license may allow works that are bundled with the original software to be distributed under a different license, as long as the original copyleft software remains unaltered and under the same license - e.g., the LGPL. Strictly speaking, BSD, MIT, and Apache are permissive licenses rather than weak copyleft - they impose essentially no copyleft at all - but for this post's purposes they sit together at the non-strong-copyleft end of the spectrum.

Stallman has argued that unless we all use GPL for all of our libraries, the open source movement will be eaten by the commercial software industry. He is wrong for three sets of reasons:

1. In many circumstances, the GPL is fundamentally incompatible with business needs, and these business needs are simply not going away.
2. The free and open source movement has been shown to co-exist harmoniously with the proprietary, commercial software industry.
3. Strong copyleft licenses have been a powerful tool for the commercialization of proprietary commercial software, under a scheme of dual licensing, totally turning Stallman's vision for the GPL on its head.

(1) Business Needs
If you are distributing software that has trade secrets embedded in its source code, a strong copyleft license would clearly be inappropriate, as it would require public disclosure of those secrets - anything from a financial hedging algorithm or the controls of precision machinery to graphics processing or even your secret fantasy football handicapping scheme. Laying your code bare would give all these secrets away, which, from a trade secret law perspective, would invalidate their standing as trade secrets. This is a big no-no.

Additionally, the terms of the Apple App Store make it very difficult to include GPL-licensed software in apps. For instance, the App Store may distribute your software outside of the United States, and the GPL forbids imposing additional restrictions - geographic restrictions included - on GPL-licensed software. So, if you are distributing software subject to export restrictions, whether due to technology, agreement, or privacy laws, you find yourself in a very tricky situation. It is navigable with clever engineering and proper lawyering, but it is an enormous headache.

(2) Harmonious Coexistence with Commercial Software
As stated in previous posts, the Ruby on Rails community basically lives off of the MIT, BSD, and Ruby licenses, all of which sit at the permissive end of that spectrum. This is simply a fact that cannot be disputed. It may be that when Stallman first founded the GNU Project, strong copyleft licenses were a necessity for the success of Linux, given the atmosphere back then, with only a few large companies controlling the balance of commercially viable developers. As time has passed, however, the number of developers has grown tremendously, and they are not all controlled by a handful of old-world corporations. As a result, there are now many viable motivations for contributing to open source projects beyond mere legal compulsion - the Apache Software Foundation is an excellent example. Quite simply, developers enjoy having access to tools with large user bases, the prestige and reputation of being an open source contributor (which may further a commercial career), and the sense of community that comes with being part of an open source project.

(3) Dual Licensing
Stallman accepts that dual licensing is legal, but he is not a fan. In essence, dual licensing (in certain circumstances called the single-vendor commercial open source business model) is where a company makes its software available under the GPL and also under a commercial, proprietary license. It can be thought of as a type of "freemium" model: as long as you are not distributing the software, you are free to use, study, and modify GPL-licensed code pretty much to your heart's content. This allows for academic use and purely internal commercial use - a hedge fund, for example, can download MySQL and modify it internally as much as it wants without worrying about the copyleft provisions. But if it wants to distribute its hedge-fund-approved version of MySQL without releasing the modified codebase to the public, it needs to pay Oracle for a commercial license. MySQL is an excellent example of the success of this model, having used dual licensing to build a substantial business.

In the end, I think it is clear that strong copyleft software has a permanent place in commercial applications - but I also believe that, at this point, the alternative motivations for contributing to open source - beyond mere legal compulsion - are more than sufficient for a vibrant community built on permissive and weak copyleft licenses to thrive.

However, if you are considering integrating GPL code into your proprietary, commercial software before it ships, I highly suggest you find a very competent lawyer.
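On that note, a cheap first pass before the lawyer meeting is to audit what your dependencies say about themselves. Below is a minimal sketch, assuming a Python project and using only the standard library, that flags installed packages whose self-reported metadata mentions the GPL family. License metadata is frequently missing or wrong, so treat the output as a list of questions for counsel, not answers; the string match is crude and will also flag the LGPL and AGPL, which is usually what you want in an audit anyway.

```python
# A minimal first-pass license audit: flag installed Python packages whose
# self-reported metadata mentions the GPL family. Metadata is self-reported
# and often absent or inaccurate - treat hits (and misses) as prompts for a
# real legal review, not as conclusions.
from importlib.metadata import distributions

MARKERS = ("GPL", "GENERAL PUBLIC LICENSE")  # crude; also catches LGPL/AGPL

for dist in distributions():
    meta = dist.metadata
    fields = [meta.get("License") or ""]
    fields += [value for key, value in meta.items() if key == "Classifier"]
    if any(marker in field.upper() for field in fields for marker in MARKERS):
        print(f"{meta.get('Name')}: {meta.get('License') or 'see classifiers'}")
```

A real audit also has to walk transitive dependencies, vendored code, and anything statically linked - which is exactly where the competent lawyer comes in.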

Thursday, May 8, 2014

Revisiting AOL - Profits down 90% in Just Two Years

Last August I wrote this post:
Of the $541M in revenue, $361.2 comes from advertising, or almost precisely 2/3 of all revenue. Further, an entire $166M comes from subscriptions, which is codeword for dial up subscribers. That's right, a full 30.6% of AOL's quarterly revenue comes from people with dial up modems. So, AOL generated 97.6% of its Q2 income from advertising and dialup. That means that all of AOL's other products, besides advertising and dialup, account for less than 3% of its income. That is not a good sign.
Just to review, AOL defines "subscribers" in its 10K as:

As of December 31, 2013, we had approximately 2.5 million domestic AOL subscribers. Our subscribers are important users of Brand Group and Membership Group properties and engaging our subscribers, as well as former AOL subscribers who continue to utilize our free service plan, is an important component of our strategy. Our paid subscription plans provide bundles of products and services ranging from online storage, privacy and security solutions to technical support, back-up and unlimited dial-up internet access options, computer protection and partner discounts.

Today, Ars Technica has the headline: In one short year, AOL’s quarterly profits plunged 66 percent.

According to their 10K:


This decline has been the result of several factors, including the increased availability of high-speed internet broadband connections and attractive bundled offerings by such broadband providers, the fact that a significant amount of online content, products and services has been optimized for use with broadband internet connections and the effects of our strategic shift to focus on generating advertising revenues, which resulted in us essentially eliminating our marketing efforts for our subscription access services and the free availability of the vast majority of our content, products and services. Although we provide many additional products and services as part of our subscription plans, there can be no assurance that our subscribers will value the bundle of services we offer and they may cancel their subscriptions at any time. If any of these factors result in our Subscriber base declining faster than we currently anticipate, our subscription and advertising revenues could be adversely affected.
Here's the year-over-year:

[chart: AOL quarterly results, year over year]

Not looking so hot for AOL.

Wednesday, May 7, 2014

So it Begins: Startups Getting Hurt by the Death of Net Neutrality

But don't take my word for it.

Via MIT Technology Review, here is Brad Burnham of Union Square Ventures:

The cable industry says such charges are sensible, especially when a few large content providers like Netflix can take up a large fraction of traffic. But if deep-pocketed players can pay for a faster, more reliable service, then small startups face a crushing disadvantage, says Brad Burnham, managing partner at Union Square Ventures, a VC firm based in New York City. “This is absolutely part of our calculus now,” he says. 
Burnham says his firm will now “stay away from” startups working on video and media businesses. It will also avoid investing in payment systems or in mobile wallets, which require ultrafast transaction times to make sense. “This is a bad scene for innovation in those areas,” Burnham says of the FCC proposal.
Just a reminder to everyone that Mozilla recently proposed the following changes to the proposed FCC net neutrality regs:

Mozilla's plan is a somewhat crafty attempt to avoid the worst of the political mess that reclassification would cause, by arguing that there are two separate markets: the markets for broadband providers to end users (i.e., our own broadband bills) and then a separate market for the relationship between internet companies (what Mozilla is calling "edge providers") and the broadband providers. Mozilla is saying that since these are separate markets, the FCC could reclassify just the connection between internet companies and broadband providers as telco services, and leave the last mile setup unchanged as an information service. Thus, it's arguing that the transit market more accurately reflects a telco service, and thus would be much easier to reclassify. In a sense, this would also be a way to attack the interconnection problem, which is where the net neutrality debate has effectively shifted. 
GigaOm comments on that same plan here, saying:

But can this work? It’s a neat way to call Wheeler’s bluff on the reclassification issue, which is so politically charged, that he truly can’t touch it. Instead of attacking the cable and telcos from the front on reclassification, he could sneak around from the side. However, Wheeler’s made statements in the past that indicate he’s okay with a double-sided market for broadband, which means he may not want to impose this new relationship on ISPs. 
Such an action would also undoubtedly lead to lawsuits if it were implemented, which throws net neutrality into doubt for even longer. However, it’s about time someone changed the terms of this debate to reflect how the internet has changed since 2002 when the FCC decided it wasn’t a utility. Since then, as people have abandoned ISP-specific email, portals and more to surf for content and choose services delivered from the wider internet, it’s clear that ISPs are a conduit for content and services, not a provider of them.
Mozilla seeks to get the FCC to recognize this in a way that might be politically viable. Hopefully the agency takes Mozilla up on the idea.


And, as a parting note, let's not forget that Tom Wheeler, the current head of the FCC, is a former cable industry lobbyist. Or, to quote Consumerist on the matter: FCC Chairman: I’d Rather Give In To Verizon’s Definition Of Net Neutrality Than Fight. How, exactly, did Wheeler get this job again? Oh right. It may have something to do with the $700,000 he raised for the Obama campaign. Smashing.

Friday, April 4, 2014

Guide to Email Production From Gmail

After spending about six hours (and losing several pounds due to fever-level frustration) working on this problem, I believe I have finally figured out how to perform something resembling methodical eDiscovery for Gmail emails if you, like me, work on a Mac. You will need the following:

1. Gmail
2. Google Vault (this software, truly, is spectacularly bad)
3. Stuffit Expander (yes, seriously, like that one from 1997)
4. Mac Mail
5. Acrobat Pro XI

Instructions:

1. Enable Google Vault for whatever email accounts you need to perform your production queries on.

2. Perform your queries.

3. Use Google Vault's incredibly ham-fisted "export" function for each of your queries.

4. Serially download each of the multiple mbox.zip files that result from your exports.

5. Unzip them with Stuffit Expander (for some unholy reason surely known only to Cthulhu and other eldritch gods, Google has chosen to use pkzip instead of, you know, just freaking zip, so your OS will think the files are corrupt unless you download Stuffit Expander, a piece of software old enough that it can vote). A scripted alternative to steps 5-7 is sketched after this list.

6. Serially import each of the mbox files into Mac Mail.

7. Select all the contents of each mbox import folder, one at a time, in Mac Mail, and print them to .pdfs.

8. Now is the time to remove emails that you do not wish to produce for reasons of relevance or privilege. Google Vault simply does not have anything remotely resembling this function: you either export the entire results of a query in a single .zip, or nothing at all. Thanks, Google Vault. Note, however, that if you are not careful in this process, you may run afoul of your privilege-logging requirements. While sorting these results, it is best to keep a separate folder for emails that you determine to be relevant but privileged, in order to preserve them for future examination.

9. When you have all your .pdfs collected in a folder, open up Acrobat and use the "Bates Numbering" function in View -> Tools -> Pages (intuitive, right?) to assign Bates numbers. (A scripted alternative is in the postscript below.)

10. Drag the files to a thumb drive.

11. Send the thumb drive to opposing counsel.

12. Curse every piece of software you had to use in this process for being ungainly, cumbersome, semi-functional and poorly designed for this purpose (note: except Stuffit Expander. It had one job and did it without any fuss, complaints or troubleshooting).
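For the similarly afflicted, here is the scripted alternative to steps 5-7 mentioned above: a minimal sketch in Python, standard library only. Every path and the "DOC" numbering prefix are hypothetical placeholders. Two hedges: Python's zipfile module may or may not cope with whatever pkzip variant Vault emits (keep Stuffit handy), and this dumps plain text files rather than .pdfs - fine for review, but you will still want PDFs for the actual production.

```python
# A hedged sketch of steps 5-7: unzip each Vault export, read the resulting
# mbox files, and dump one numbered text file per message for review.
# "vault_exports", "DOC", etc. are hypothetical placeholders; untested
# against Vault's actual pkzip output.
import mailbox
import zipfile
from pathlib import Path

EXPORTS = Path("vault_exports")   # folder of downloaded mbox.zip files
UNPACKED = Path("unpacked")
REVIEW = Path("for_review")
UNPACKED.mkdir(exist_ok=True)
REVIEW.mkdir(exist_ok=True)

# Step 5: unzip every export (zipfile may or may not accept Vault's archives).
for archive in EXPORTS.glob("*.zip"):
    with zipfile.ZipFile(archive) as z:
        z.extractall(UNPACKED)

def plain_text(msg):
    """Best-effort extraction of a message's text/plain body."""
    parts = msg.walk() if msg.is_multipart() else [msg]
    for part in parts:
        if part.get_content_type() == "text/plain":
            payload = part.get_payload(decode=True)
            if payload:
                charset = part.get_content_charset() or "utf-8"
                return payload.decode(charset, errors="replace")
    return ""

# Steps 6-7: one numbered text file per message, headers first.
counter = 1
for mbox_path in sorted(UNPACKED.rglob("*.mbox")):
    for msg in mailbox.mbox(str(mbox_path)):
        out = REVIEW / f"DOC{counter:06d}.txt"
        with open(out, "w", encoding="utf-8") as f:
            for header in ("Date", "From", "To", "Cc", "Subject"):
                f.write(f"{header}: {msg.get(header, '')}\n")
            f.write("\n" + plain_text(msg))
        counter += 1
```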

And people wonder why lawyers are always angry.
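Postscript, since step 9 deserves the same treatment: Bates stamping can be scripted as well. A hedged sketch, assuming the third-party pypdf and reportlab packages are installed (pip install pypdf reportlab); the "ABC" prefix and file names are hypothetical.

```python
# A hedged sketch: stamp sequential Bates numbers on every page of a PDF.
# Assumes the third-party pypdf and reportlab packages; "ABC" is a
# hypothetical prefix. Returns the next unused number so numbering can
# continue across a whole folder of PDFs.
from io import BytesIO

from pypdf import PdfReader, PdfWriter
from reportlab.pdfgen import canvas

def bates_stamp(src, dst, prefix="ABC", start=1):
    reader = PdfReader(src)
    writer = PdfWriter()
    n = start
    for page in reader.pages:
        w = float(page.mediabox.width)
        h = float(page.mediabox.height)
        # Draw the Bates label onto a one-page overlay, bottom-right corner.
        buf = BytesIO()
        c = canvas.Canvas(buf, pagesize=(w, h))
        c.drawRightString(w - 36, 20, f"{prefix}{n:06d}")
        c.save()
        buf.seek(0)
        page.merge_page(PdfReader(buf).pages[0])
        writer.add_page(page)
        n += 1
    with open(dst, "wb") as f:
        writer.write(f)
    return n  # next unused Bates number

# Usage: chain across files so numbering continues document to document.
# next_n = bates_stamp("DOC000001.pdf", "stamped/DOC000001.pdf", start=1)
```

Because the function returns the next unused number, you can loop it over a folder of PDFs and keep the numbering continuous across documents, which is what opposing counsel will expect.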

Thursday, February 20, 2014

No, Killing Net Neutrality Does Not Help the Underdog


Killing Net Neutrality Helps Underdogs Succeed


Go ahead and read it - but suffice it to say that I'm not going to engage with the majority of what is said there - I just don't have time.

I'd just like to address one point that is made therein:

Net neutrality activists often fear that because small content creators couldn’t compete in such a market, it would in turn reduce the diversity of internet content. (For example, some argue that users would “naturally gravitate” to the big brands who can afford to pay the bill if Comcast and Verizon do decide to charge them more for streaming video over their pipes.) 
But the video market is already very different from the rest of the web. Because it doesn’t make sense to build one’s own streaming infrastructure when you can embed a YouTube player that better delivers streaming capacity anyway, the “small guys” creating video content already work through large platforms such as Netflix and YouTube. In short, these intermediaries help to solve the capacity problem, countering whatever market power broadband providers might have.
Do you see the switch that has occurred there? Just because many people are only now becoming aware of the net neutrality debate, thanks to the Netflix throttling saga, does not mean net neutrality is about video†.

It's about developers.

Let me explain, because this is what net neutrality is about:

If you were to build an awesome new site - let's say Twitter for cats; let's call it kitter - and put it on the web, you might wonder why no one is getting to your site, and why those who do complain of a truly horrible user experience. Because the average bounce time of 15 seconds is less than your average load time of 30 seconds, you simply get no traction, despite the fact that we know there are literally billions - billions - of cats that want onto your site. Then you figure out why when, as your blood runs cold, you get an email from Verizon demanding 50 cents per user to ensure load times of less than one second. Well, you sigh and take down your service, because you are working out of your garage while holding down a steady job as a graphic designer, and you cannot afford to pay several million dollars to get your first few hundred repeat users.

That is the story of a world without net neutrality: ultralight, hugely innovative products killed before they can ever leave the garage or dorm room, because they would have to pay ridiculous access fees to the networks before they ever leave the starting gate. You know, just like little-known services such as Twitter and Facebook - both invented in dorm rooms, and both of which got huge traction long, long before making anything remotely resembling real money. If either had had to pay huge up-front service fees to ISPs just to get their message out, they would have been dead in the water.

Also, please note that one of the authors writes from George Mason's Mercatus Center, which is apparently funded by Koch money. That may or may not matter, but from what I've seen, the Center is pretty committed to letting markets regulate everything - and that stance alone makes its commentary on regulation suspect, regardless of where its money comes from. Either way, take it with a grain of salt. As you should the entire op-ed.

Further note that the other author is from TechFreedom, which, from my reading, is not about the freedom of tech, but about the freedom of tech from regulation. From the list of goals on their own site:
1. To make the case for pragmatic optimism by highlighting the benefits of technological change and bottom-up solutions to concerns raised by change.
2. To highlight the costs to consumers of regulatory overreach.
3. To develop and defend the most effective means for government to remedy real harms—focusing on increased education, innovation in consumer empowerment tools, and better enforcement of existing laws.
4. To facilitate constructive, serious dialogue on technology policy through regular events.
It's convenient that they managed to come up with an argument that supports one of their preconceived notions.

Reading more about them, I find this position somewhat strange, as I do (for the most part) agree with their stances on CDA § 230 and the CFAA - but they seem to have really missed the boat on net neutrality. It is just irksome to see arguments as poor as those advanced in the Wired op-ed come from a source that otherwise has a lot of good things to say. I think it clouds their overall message.

Luckily, the White House and the FCC both seem to disagree with these guys, which is at least somewhat promising. Let's hope the next draft of the FCC rules stands up to court scrutiny.

Update: Just wanted to post this to give a final bit of color to the absurdity of the whole debate we are having now: American internet is vastly overpriced.

† Note, however, that their argument about video is poor. The claim that there is no problem simply because small content creators can leverage Netflix's massive infrastructure and deep pockets is nonsense - and they even point out the flaw in their own argument: Netflix acts as a gatekeeper. What if you create video and want to self-distribute, because Netflix is totally opaque about viewership stats and not everyone is thrilled about Google? If you are an amateur artist who wants to post video on your own site, hosted on your own server, then without net neutrality enforced, expect the viewing experience to be totally unusable.