I've never had a bad experience with Amazon or any of their third parties across hundreds (!) of transactions. Long-time collector here; I purchased two pieces of art yesterday via the beta. Known gallery, brick-and-mortar. It's already a total debacle.
Late 90s press quoted Bezos as saying he wanted Amazon to sell everything except livestock and gasoline. Compared to art galleries, there's less bullshit and more ethics among the beef and oil companies.
But I pushed the button yesterday specifically knowing Amazon had my back. That says a lot.
Nothing that's Amazon's fault. Amazon purchase history shows order complete, paid via Amazon credit card. Gallery currently refuses to ship either piece.
Verdict: yes, Amazon has successfully brought the magical experience of dealing with art dealers online. Not sure earning their cut is going to be much fun for them, but like Prime I'll happily use it as long as they offer the implausible service.
Thank you for having "developer at artsy.net" in your bio as I would have found this continued questioning odd otherwise.
The gallery cancelled the order claiming both pieces are out of stock. A direct email to the gallery said the pieces were in stock. Typical gallery scam nonsense.
And literally while I was typing this, Amazon called me, totally unprompted.
So there's your verdict. Art galleries still suck, Amazon is still awesome, and it's still virtually impossible to make a significant purchase on the internet in 2013.
Thank you for this. One clarification: the "Bieber YouTube upload" takedown scenario remained a DMCA issue. The non-commercial = non-felony language was actually made even more explicit under SOPA.
That was actually one of the key points of SOPA -- it extended DMCA-style service provider protection to payment providers and ad networks. As for the "streaming" provision, that was (inartfully) meant to penalize the day 0 crowd, especially for unreleased media. It was not intended to penalize people downloading, watching, or what people who read this site consider "streaming."
The language could and should have been cleaned up then; it all got lost amidst the SOPA screaming. Penalties for "streaming" are covered by the DMCA inside the US. It's how YouTube and UStream can exist. And I think we've seen precisely how many felonies and how much jail time resulted from cover songs and Justin Bieber lip dubs since that law passed 15 years ago.
But right now someone outside the USA can take another party's shady (or well-intentioned) stream of a pay-per-view, wrap ads from an ad network around it, and accept PayPal in exchange for access to the "unrestricted stream" (a scam). And there's no legal framework to deal with it.
That's obvious scammy theft regardless of where you stand on copyright. And all the "good guys" (ad networks, payment providers) remain liable whether they respond to takedown requests or not. SOPA let people take down the ads and PayPal links if they couldn't get the site down. It gave PayPal and Google the same kind of process and liability protection for payments and ads that YouTube already enjoys for video.
This needs to get fixed and techdirt, boingboing and others need to hire someone with minimal legal expertise (optimally Congressional litigation support experience) if they're going to keep hammering this for pageviews and anti-copyright cred.
For starters, it's ntoskrnl.exe. Dig out the original Windows NT stuff from the early 90s and you can infer Dave Cutler and team's original vision. The fact that Microsoft made hundreds of billions of dollars off it, layering all sorts of legacy chaos atop it, obscures the jewel at the core.
There's a third issue here, Tim: a perceived downward trend in the quality of the offerings of O'Reilly Media. My first O'Reilly book taught me sed/awk. Now you print magazines with "How to knit a robot" on the front cover.
I'm all for throwing stuff at the wall and seeing what sticks, but you seem spread awfully thin.
O'Reilly hired a pal to speak at RailsConf based on a popular Web 2.0 site he built. Unfortunately he didn't build it. Also the site was written in C#.
Well, there are a lot of people who love Make: magazine. I've seen other comments that it's the best part of O'Reilly, which we lost when we spun it out in December :-) So tastes may differ.
I don't know about your Railsconf issue. Please send more details.
hi tim, it's not about taste, the "perceived downward trend in the quality" is real. i owe my career to o'reilly. i have a cherished assortment of 25 or more o'reilly books, some used and read so often that they look more like an original gutenberg bible than a computer book from the nineties/2000s. sadly i haven't been able to add a new book to this collection for years.
for a very long time i believed that there was something like the O'Reilly (animal books) standard: that whenever one of your books is read front to cover you a) know more about the topic at hand than 99.9% of the rest of the world and b) have a deeper understanding of the topic.
while a) might still be true from time to time, b) is not true anymore, because the books are quite bad. and by bad i mean poorly edited (i.e.: the art of SEO, first edition, and a lot more), completely unstructured (couchDB first edition), a scam (the one with the cow on the cover, the only book i ever sent back to amazon; just found it, it was the Data Source Handbook, 46 pages, 24 EUR, unbelievably poor "content") or just ... not a good book.
where i once stood in the computer book section, studied each o'reilly book and decided what to learn that month, i now look in the other direction.
where i once recommended every one of your books to every dev rookie, i now point them to pragprog.
i believe you are a busy man, but if you would from time to time pick up one of your books, read it front to cover and then ask yourself if it is really a book worthy of having your name on the front of it, that would (probably) already help a lot.
Why is this an invalid approach? If you have an agreed-upon OS installation, you've got at least several hundred megabytes of binary data from which you can extract sequences for a lookup dictionary. To compress, you locate the largest substrings of the input that also appear somewhere in the OS data. For example, you might find that bytes 1000-2000 of the input data are the same as bytes 4528-5528 of /bin/ls. Then you just need a program that can extract this byte range from /bin/ls and fill it in at the appropriate place.
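Here's a minimal sketch of what that encoder could look like, assuming /bin/ls as the agreed-upon reference file and made-up file names and thresholds; the naive scan is far too slow for a real entry, but it shows the shape of the idea:

    using System;
    using System.IO;

    class OsDictionarySketch
    {
        // Length of the longest common run between input[pos..] and reference[r..],
        // maximized over r (naive O(n*m) scan; a real tool would build a suffix array).
        static int LongestMatch(byte[] input, int pos, byte[] reference, out int refOffset)
        {
            int bestLen = 0;
            refOffset = 0;
            for (int r = 0; r < reference.Length; r++)
            {
                int len = 0;
                while (pos + len < input.Length && r + len < reference.Length &&
                       input[pos + len] == reference[r + len])
                    len++;
                if (len > bestLen) { bestLen = len; refOffset = r; }
            }
            return bestLen;
        }

        static void Main()
        {
            byte[] reference = File.ReadAllBytes("/bin/ls");   // the agreed-upon dictionary
            byte[] input = File.ReadAllBytes("input.bin");     // hypothetical target file
            const int PointerCost = 7;                         // ~4 B offset + 2 B length + 1 B tag

            for (int pos = 0; pos < input.Length; )
            {
                int off;
                int len = LongestMatch(input, pos, reference, out off);
                if (len > PointerCost)
                {
                    // Cheaper to point at the reference than to store the bytes themselves.
                    Console.WriteLine("COPY  {0} bytes from reference offset {1}", len, off);
                    pos += len;
                }
                else
                {
                    Console.WriteLine("LITERAL 0x{0:x2}", input[pos]);
                    pos += 1;
                }
            }
        }
    }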
Of course, it isn't a given that you will be able to find a sufficiently large overlapping substring in the OS binary data. And it may not be allowed to assume an agreed-upon OS installation, although it was OK to assume that gunzip was available. And finally, doing this sort of largest-overlapping-substring search could be very very slow.
The real key here is that the challenge is about compressing one specific random file. It's clearly impossible to write a compressor that can represent ANY sequence of N bytes in fewer than N bytes including the decompressor. But this challenge is constrained to reconstructing a single input file, not handling arbitrary inputs. If the input files are random, and you keep asking for them, there's a tiny chance you'll get one with enough repetition that even gzip could compress it. If you can key against a large set of reference data, your odds improve significantly.
So, I don't know if this is a practical approach given the restrictions of the contest, the cost of entering, and the expected payoff, but I would say that if it is plausible to use a large body of OS files for lookup, this could be a winnable challenge.
Because of information theory. It's believed that the binary expansion of pi contains all possible finite bit sequences. A program that expands pi is relatively small. Assuming that hypothesis is true, does that mean we could write a compressor that simply breaks input into blocks and then encodes the output as indexes into pi?
And the answer is that we certainly could write that program. And the result would almost always be a larger output than input. Why? Because you'd have to search so far into pi that the index would contain more bits than the input blocks. In short, having a library of arbitrary strings to draw from does not help you to compress data.
More generally, compression is impossible in the general case. Every lossless compression algorithm will, on average, produce an equal or larger output than its input.
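The counting argument behind that last claim is easy to check directly. A toy version for 4-byte inputs (block size chosen only for illustration):

    using System;

    class PigeonholeCheck
    {
        static void Main()
        {
            const int n = 4;                 // block size in bytes, purely illustrative

            long inputs = 1;                 // number of distinct n-byte inputs: 256^n
            for (int i = 0; i < n; i++) inputs *= 256;

            long shorterOutputs = 0;         // number of distinct outputs of 0..n-1 bytes
            long c = 1;
            for (int k = 0; k < n; k++) { shorterOutputs += c; c *= 256; }

            Console.WriteLine("{0:N0} possible inputs, {1:N0} strictly shorter outputs",
                              inputs, shorterOutputs);
            // 4,294,967,296 inputs vs 16,843,009 shorter outputs: over 99.6% of all
            // 4-byte inputs cannot map to anything shorter without two inputs colliding.
        }
    }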
> Of course, it isn't a given that you will be able to find a sufficiently large overlapping substring in the OS binary data.
It seems that it's a given that you will not find that substring. The larger it is, the less likely you will find it and the less likely it will repeat in the uncompressed file.
Unlikely but not impossible. Note that the substring does NOT need to repeat in the uncompressed file; it just needs to be possible to say "copy bytes X through Y of /bin/ls here" in fewer than Y-X bytes. I'd love to know what the odds really are, but it seems like you have some good options for improving them (e.g. ask for a really huge input file).
> it just needs to be possible to say "copy bytes X through Y of /bin/ls here" in fewer than Y-X bytes
It is very unlikely that the space required to store X will be smaller than the length of the sequence you find. Your plan is similar to saying "The input could contain long repeated segments containing only zeroes. Compress those using scheme Z. The longer the file, the more likely this is to occur." Information theory allows us to prove that this cannot always work. In fact it is very unlikely to work for a specific randomly generated target file.
So by saying 'cannot _always_ work' and 'very unlikely to work', you are implicitly conceding the point: that it is possible. The question then becomes, given the size of the OS environment space (and the quality of the randomness in the originals), could one in 50 attempts actually succeed in shaving off a byte? I sincerely doubt that Mike, despite his superior knowledge of information theory, actually took the time to figure out just how good or bad the odds were. He assumed that it was fundamentally impossible, when it is in fact not impossible, just unlikely (to some unknown [to me] degree).
Let me just put it this way. It's more likely that you could win using gzip. It's similar to saying it's "possible" that the target file could have contained all zeroes.
Assuming the target file is competently constructed, the chances of winning wouldn't make it a rational bet for $0.01
On average, even if you have the specific file to work with, encoding those numbers (the 1000 and 2000 in the example above) will cost at least as much space as the length of the sequence you find. So it would be possible to do this, but the sequences would be so short, and the offsets so large, that you'd end up losing.
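To put rough numbers on that (all sizes assumed for illustration, not taken from the contest): with about a gigabyte of OS files as the dictionary and a one-megabyte random target, the expected number of shared k-byte substrings drops by a factor of 256 per extra byte, and it dies out right around the ~7 bytes a (file, offset, length) pointer would cost.

    using System;

    class MatchOddsSketch
    {
        static void Main()
        {
            // Assumed sizes, chosen only for illustration.
            double refBytes   = Math.Pow(2, 30);   // ~1 GB of OS files as the dictionary
            double inputBytes = Math.Pow(2, 20);   // ~1 MB uniformly random target file

            for (int k = 4; k <= 10; k++)
            {
                // First-order estimate of the expected number of k-byte substrings the
                // random input shares with the reference:
                // (input positions) * (reference positions) / 256^k.
                double expected = refBytes * inputBytes / Math.Pow(256, k);
                Console.WriteLine("k = {0,2} bytes: ~{1:G3} expected matches", k, expected);
            }
            // A (file, offset, length) pointer costs roughly 7 bytes, so only matches
            // of 8+ bytes save anything -- and their expected count here is about 6e-5.
        }
    }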
One way to think of that agreed-upon OS installation is as a lookup dictionary. To decompress the file, both ends need to agree on exactly what that dictionary is. If you need an agreed OS installation to act as a lookup dictionary before decompression can happen, by all rights you should include the size of the OS in your total of how large the compressed file is.
If you want to use an arbitrary OS, the "decompressor" would need to be able to identify the correct fragments. If you can do that in less space than the original files, you've already managed to compress the data!
Click the Xamarin Dev Center link. You'll see Android and iOS but no Linux. They're focusing on mobile client tools.
They never got the full stack running on the server and they punted most of the Windows-specific client stuff from the start.
They landed on a super smart subset and seem to be kicking ass with it. A C# compiler with some odd omissions and cool enhancements + native bindings to iOS and Android equals a damn useful tool. If you're building .NET or even Java backends it's certainly a very sane way to hook into them from Android phones and tablets in the enterprise.
But it's not a cross-platform .NET environment by any stretch and certainly isn't on the path to becoming one.
I hadn't heard this story before, but time isn't kind to this particular conspiracy. The moment of clarity: Mr. Curry wanted to sue Microsoft but "couldn't find a lawyer willing to take on the case" -- in 1998. EVERYBODY was suing Microsoft in 1998, including multiple governments. If you couldn't find someone to sue them that year, I'm afraid you don't have much credibility.
And this made my head hurt:
"All computer security systems begin with the Intel processor itself," Curry said. "I helped Intel develop their processor, so I know how they work and how vulnerable they can be if left exposed." ... "In fact," he added, "Microsoft NT 4.0 is the least secure of all the NT versions... Processors on Windows NT Version 4.0 are insecure because they have been designed to automatically open the processor up to accept commands on start-up."
I love how everyone is an instant expert on the Internet, even if they have only heard of the issue minutes before. I'm not a random internet conspiracist. I'm an established member of this community reporting what happened to someone that I considered a friend at the time that it happened.
Here is the story as I remember it.
The private lawsuit that Ed Curry had standing to bring was a complex contract violation between himself and Microsoft. The fact that Microsoft was not carrying through with their obligations left Ed Curry with very poor personal finances. Therefore any lawyer who took the case on would be doing so on contingency. No matter how many other lawsuits may have been filed, it is not a particularly easy matter to find a lawyer who is willing to spend years in a private lawsuit against pockets as deep as Microsoft's in the hope that someday, maybe, you'll get a big enough settlement to justify it.
So what were Ed Curry's other options?
Well he was aware that Microsoft was breaking the law in a rather egregious way. Windows NT 3.5.0 service pack 3 had a C2 certification. Ed knew this, he is the person who had done that security evaluation. (Which he did on the very contract that Microsoft was breaking the terms on.)
However Microsoft was advertising that Windows NT 4.0 had a C2 clearance. And was selling that into government departments whose regulations required that clearance. Ed Curry was aware of the false advertising, and the lack of clearance, and was furthermore aware that major design decisions, such as putting third party graphics drivers into ring 0, made the attack surface against Windows NT 4.0 sufficiently large that it could not qualify for C2 certification. (Historical note, Windows NT 4.0 never got that certification. But many years later, on service pack 6, they got a British certification that they claimed was equivalent.)
But what could he do about that? Microsoft was clearly breaking the law. But as a private individual, Ed did not have standing to sue Microsoft for the false advertising. He's not the wronged party, you need someone like the attorney general to sue. But Microsoft was politically connected, and getting those people interested is difficult.
What Ed decided to do - in retrospect it was clearly a mistake - was to go public with Microsoft's lawbreaking in the hope that he could get the attention of someone sufficiently highly placed to force Microsoft to follow the law. That's when Microsoft went nuclear. They paid every one of his clients to go elsewhere. After his company went bankrupt, when he got a job they paid that company to preemptively fire him. After several months of this, he died of a heart attack.
Incidentally you may wonder why Microsoft broke their contract with him in the first place. The reason was simple. They came to him with NT 4.0, and said that they wanted C2 clearance. He came back and said that it would never pass, and explained why. They told him to lie so that they could get the certification. When he refused to lie, they decided that they would punish him for failing to cooperate, and decided to not live up to their side of the agreement, safe in the knowledge that he was not going to have a reasonable chance of successfully suing them for it.
That's what happened, and I don't much care whether you happen to believe it. I was there, you weren't, and people who are active on HN will make up their own minds about me.
I knew Ed Curry and worked with him at his home north of Austin for some reporting I did regarding bugs in the Cyrix CPUs. He was a friendly and kind-hearted person, deeply devoted to both his religion and his family. With respect, however, he did not have the best business judgement. I spoke with him during the time he was setting up his NT certification business. I do not recall all the details, but even today I remember feeling uneasy that he was investing so heavily in creating a business for C2 certification before demand had proven itself. The alarms were going off in my head. I really think that Ed read a lot more into the relationship than he had a right to.
And here's where I flash pocket aces: I sat in a room with no windows and no computers, across from men with strong chins and short haircuts, reviewing Windows NT source code line by line. On friggin' paper.
Never heard of this guy. Never heard this story. It makes no sense, and I cannot even imagine what "automatically open the processor up to accept commands on start-up" means.
Mr. Curry eventually met with senior NSA/DoD officials, aired what he had -- while a major government lawsuit against Microsoft played out -- and nothing.
Also, Windows NT 4.0 very much did get C2 certification and had E3 (equivalent but not transferable) at the time. Which again doesn't help the story in hindsight.
I mean, seriously... read this nonsense (gcn.com). This stuff doesn't even qualify him for a Wikipedia entry. It's just the story of someone who cracked under the pressure of releasing a version of NT every year for four years straight. He certainly wasn't the only one.
-----
Curry also gave Schaeffer an updated document pulled from Microsoft’s Web site. Under a section of frequently asked questions on security, the site answered the question: “Is Windows NT a secure enough platform for enterprise applications?” by stating that the company recently enhanced the security of NT Server 4.0 through a service pack.
“Windows NT Server was designed from the ground up with a sound, integrated and extensible security model,” the Microsoft Web site said as late as last week. “It has been certified at the C2 level by the U.S. government and the E3 level by the U.K. government.”
Hodson said the passage claiming C2 certification cited by Curry refers to NT 3.5 with Service Pack 3, which is the only version of NT to meet the NSA’s C2 level requirements to date. But because the passage earlier mentions NT 4.0, Hodson said, the meaning could be misconstrued.
Interesting. On Microsoft's own site they have http://support.microsoft.com/kb/93362 which does not list 4.0. But I found several references claiming that they did achieve C2 certification with service pack 6 in early 2000. My memory had that as a British certification that they claimed was equivalent, but Google is not turning up anything that supports my memory.
However that said, by the time they got that many service packs out, it was clearly no longer the same operating system that they were pushing in 1995. There will never be proof either way, but my belief is that the reason that it took 6 service packs before that certification happened is that there were real security flaws in early NT 4.0.
As articles like http://www.wired.com/science/discoveries/news/1998/05/12121 make clear, Ed Curry's claims were serious enough to be reported in the press at the time. And governments are large and diverse enough that there is no reason to believe that the opinions of people pursuing an anti-trust case about browsers would have much impact on people. This qualifies as a lot more than "nonsense".
As for your "pocket aces", I have absolutely zero clue who you are or whether you're telling the truth. I have no reason to doubt that people who would have been reviewing that code would find themselves on Hacker News. Obviously if you were working for the NSA, you wouldn't be likely to be inclined to leave a traceable trail all over the internet demonstrating that fact. However you wouldn't necessarily know everyone else involved. Nor after 17+ years can any of us claim perfect memory of everyone we might have worked with.
But I did know Ed somewhat. My impression of Ed, and the impression of many others we both interacted with, is that he was a credible witness. I never encountered any evidence that indicates that he was lying.
Yes, they list the advice in the article as applying to NT 4.0. And the advice on access controls does apply there.
But the only sentences stating that specific versions have actually received C2 type certifications are in the summary. And the statement there is that 3.5 was certified as of 1995 in the USA, and 3.5.1 was given a E3/F-C2 rating in the UK. Nowhere in that article does it say that any version of 4.0 ever received C2 certification.
If you think I'm missing something, please quote directly from the relevant section of the article.
"SAIC's Center for Information Security Technology, an authorized TTAP Evaluation Facility, has performed the evaluation of Microsoft's claim that the security features and assurances provided by Windows NT 4.0 with Service Pack 6a and the C2 Update with networking meet the C2 requirements of the Department of Defense Trusted Computer System Evaluation Criteria (TCSEC) dated December 1985." [1]
Anyway, isn't all of this missing the point that the TCSEC C* requirements didn't really amount to much anyway? It's a pity no general purpose operating systems were ever evaluated to A1 criteria, and that the Common Criteria haven't led to systems like EROS/Coyotos/Capros receiving more development attention.
Also: MVC2 runs under .NET 3.5, which doesn't even have the dynamic keyword. (I don't use dynamic in MVC3 or MVC4 either...)
The "stringly typed" (magic string) stuff was always avoidable. Regardless, see the [CallerMemberName] annotation and others which solves it back to INotifyPropertyChanged.
Now that the backlog of Microsoft tools has shipped, the scaffolding makes a bit more sense. The MVC team released multiple versions (open sourced!) instead of waiting for VS11. Which actually lines up with your core argument.