Broback's Blog


How to Get Twitter Followers — NOT. “Faker Score” May Reveal Purchased Fans

August 28th, 2012 · No Comments · Social Media

[Image: Milli Vanilli album cover]

For those who help various organizations evaluate their real and potential online audiences, it’s always nice to find a promising new service that can help determine how much effective reach someone has online.

There’s been a lot of conversation of late in the political sphere surrounding how “real” candidates’ Twitter followings are, especially in a world where massive numbers of followers can be purchased.

I had the opportunity to test statuspeople.com, the “faker” detection service often referenced in these articles, and was impressed with what I found when the service was run against a very small, yet clearly defined Twitter account.

The other day a friend emailed me asking me to check out his Twitter follower count. I was surprised to see it had surged by 7,500 percent(!). His explanation for the thousands of new followers? Simple. “I bought ’em.” He had done so on a lark, just to test the service, and paid maybe $20 to garner over 10,000 new followers. Prior to this, he had grown his following carefully, so I felt comfortable that about 13% of his now massive total follower count was “real.”

With this information, I ran to statuspeople.com to see what they thought of his follower base. Here is what they came up with.

[Image: faker score results]

They nailed it pretty well IMHO….

Are the brand-name social metrics companies missing the boat?

A few weeks back I attended a carefully crafted social media “influencer” event (one where I did not have a hand in formulating the guest list). Initially I was concerned that I didn’t recognize many of the attendees, but my fear of missing the boat quickly dissipated. One “VIP” I didn’t know seemed to have impressive numbers: over 20K followers while only following a few score in return. At first blush, it seemed the analysis/campaign done via a household-name social metrics platform had found someone intriguing. After running this person through statuspeople.com, I came to a different conclusion:

[Image: faker score results for the “VIP” account]


Site Scraping? Needlebase R.I.P., Long Live Outwit Hub

August 10th, 2012 · No Comments · Uncategorized

Are you a Web data geek? Are you into scraping sites? If so, you may be one of the many people who rely on the highly regarded Needlebase to help you in your efforts.

After reading about Needlebase on RWW (a very, very cool example of the potential here), I spent many hours playing with it. While I was able to do some interesting and fun things, I finally gave up on Needlebase because of several issues:

Byzantine interface. Built by engineers, for engineers. While I’m sure Professor Frink would have had no issues, mere mortals were no doubt lost much of the time.

Latency. Since it was a Web-based service running in the browser, and setting up a sequence required clicking through page after page after page, it always felt like I was running in molasses. (That might have been more of a Safari issue than a Needlebase issue…?)

Constant denials. Since it was a Web service, and sites hate getting scraped, sites learned to deny access to Needlebase. That’s not Needlebase’s fault, but you generally didn’t know until well into the process that your efforts were all for naught. (See the interface issues above.)

The bad news is that Google (which acquired the service) has now killed it.

The good news is that most people can happily get what they need using the excellent Firefox plug-in OutWit Hub.

I discovered OutWit Hub months ago and never looked back. Turned out I wasn’t a dope after all. Half an hour after installing it, I was happily extracting data from a myriad of sites.

OutWit Hub works like normal humans would expect. Define fields, establish pre- and post-tag HTML markers, enter a URL, and scrape away. No foolish questions, site denials, or convoluted dialogs popping up. Since much of what’s happening runs locally on your CPU, the latency issue largely vanishes. Cheap too: pay $35 one time, scrape forever.
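For readers who want a feel for that workflow, here is a minimal sketch of the pre/post-marker idea in plain Python. It is not OutWit Hub’s implementation or API; the URL and the marker strings are hypothetical placeholders.

```python
# Rough sketch of the "define fields via pre/post markers" idea in plain
# Python. This is NOT how OutWit Hub works internally -- just an
# illustration of the concept. The URL and marker strings are hypothetical.
import re
import urllib.request

URL = "https://example.com/listings"  # hypothetical page to scrape
FIELDS = {
    # field name: (text appearing just before the value, text just after it)
    "title": ('<h2 class="title">', "</h2>"),
    "price": ('<span class="price">', "</span>"),
}

def scrape(url, fields):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    results = {}
    for name, (pre, post) in fields.items():
        # capture everything between the pre and post markers
        pattern = re.escape(pre) + r"(.*?)" + re.escape(post)
        results[name] = re.findall(pattern, html, flags=re.DOTALL)
    return results

if __name__ == "__main__":
    for field, values in scrape(URL, FIELDS).items():
        print(field, values)
```

Once the field markers are defined, extraction is a mechanical, local pass over the page, which is why the latency and pop-up-question problems largely disappear.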

R.I.P., Needlebase. I know there were a lot of people who relied on it and liked it a lot. I guess I’m glad now that I just couldn’t warm up to it.


A New Way to Think About What “Millionaire” Means

July 8th, 2012 · No Comments · Public Choice Theory

If you cash in your winning million-dollar lottery ticket in Virginia like this man did, you can take it in installments of $40K a year for 25 years.

If you are a retired teacher in Illinois, your pension is (on average) $45K per year. Plus it will go up annually thanks to automatic cost of living increases.

Note that data from the U.S. Department of Education shows a median retirement age for public school teachers of 58 years. Given a life expectancy of 81, that’s 23 years of expected payouts (not including death benefits to spouses, etc.). I am also not including pensioners’ health benefits, nor the fact that these teachers contribute to their pension fund at a rate of 9.4 percent per year.
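A rough back-of-the-envelope comparison, as a sketch only: the 3% annual cost-of-living increase is an assumption on my part, and taxes, discounting, death benefits, health coverage, and the 9.4% contributions are all ignored.

```python
# Back-of-the-envelope nominal totals only: no discounting, no taxes,
# no death or health benefits, no employee contributions.
lottery_annual = 40_000   # Virginia installment option
lottery_years = 25

pension_annual = 45_000   # average Illinois teacher pension (starting value)
pension_years = 23        # retire at 58, life expectancy 81
assumed_cola = 0.03       # assumed annual cost-of-living increase

lottery_total = lottery_annual * lottery_years
pension_total = sum(pension_annual * (1 + assumed_cola) ** year
                    for year in range(pension_years))

print(f"Lottery installments:         ${lottery_total:,.0f}")
print(f"Pension with assumed 3% COLA: ${pension_total:,.0f}")
```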

I’d love to see a real cash flow analysis of everything involved. Given this quick pass, I think it’s safe to say that there are definitely some (many?) public employees who are effectively lottery winners.


A Little Perspective: Medicare Fraud vs Wall Street Bonuses, Profits

July 7th, 2012 · No Comments · Public Choice Theory

Found these numbers interesting.
Medicare Fraud? $60 billion a year.
Wall Street Profits? $13.5 billion. Bonuses? $19.9 billion.
[Chart: Medicare fraud vs. Wall Street profits and bonuses]


See You at The Corporate Social Media Summit SF!

June 18th, 2012 · No Comments · Events, Social Media

I don’t blog enough here about all the cool social data tools and services I use, but I frequently get a chance to demo my favorites on stage (and also blog about them at our Tweet House site).

I’ll be in San Francisco on Wednesday at The Corporate Social Media Summit serving as a moderator, speaker and co-host.

I plan to describe and show off a lot of life-changing data services and applications such as Followerwonk, Tweetreach, Cloud.li, Tagcrowd, Topsy, DataSift, Gnip, and the amazing and versatile OutWit Hub.

As always, I’ll spend a little time showing off one of my favorite (albeit offbeat) tools, FileMaker Pro. I love its ability to serve as a customizable web browser and site scraper, all while tapping into remote SQL databases.

Here is more info about my session. If you want to sign up, save $300 by using the discount code “BROBACK”.
[Image: session description]


The Efficiency of Medicare: Real World vs Theory

April 29th, 2012 · 4 Comments · Economics

Efficiency is generally thought of as the relationship between inputs and outputs. Economic efficiency is defined by Paul Heyne here:

“To economists, efficiency is a relationship between ends and means. When we call a situation inefficient, we are claiming that we could achieve the desired ends with less means, or that the means employed could produce more of the ends desired. “Less” and “more” in this context necessarily refer to less and more value. Thus, economic efficiency is measured not by the relationship between the physical quantities of ends and means, but by the relationship between the value of the ends and the value of the means.”

I’ve been meaning to respond to the many posts and articles that strongly assert that rising health care costs are best dealt with by more collectivization. The model for this collective effort is usually Medicare, and enthusiasts for the Blue model can usually conjure up data and anecdotal assertions that fit the bill as needed. Examples that imply or signal efficiency (“great reports,” etc.) are trotted out.

Sadly, idealized signals are no substitute for reality, and I recently came across a concrete example of inputs vs. outputs that went well beyond mere signaling or tortured data.

Imagine you are checking into the hospital for surgery and are evaluating the payment options. You ask the administrator what your choices are regarding paying cash or using your Medicare coverage. Which of the following two scenarios makes more sense to you:

1) The Collectivist Scenario:
You: “How much will it cost if I pay you with cash vs Medicare?”
Administrator: “Luckily you have Medicare coverage. Due to their negotiating clout and reduced administrative overhead, the regular cash cost of $11,000 is only $3,400, and with your 80% coverage, you only pay $680.00.”

2) The Market Scenario:
You: “How much will it cost if I pay you with cash vs Medicare?”
Administrator: “You paying now will greatly simplify things. Sparing us from the administrative hassle of the Medicare paperwork and the long wait to be reimbursed has value. A cash payment now would be $3,400. Otherwise, we will bill Medicare at an effective rate of $11,000 and you will pay your 20% share: $2,200.”

Whatever your ideological predilections, #2 is the reality of what I faced recently when I checked a relative into the hospital, and it’s not an anomaly. Play with Google a bit, and you’ll find a nearly inexhaustible set of accounts similar to mine. Help your hospital avoid the “efficiency” of Medicare and they will eagerly reward you. That’s IF they even accept Medicare. As the NY Times reports, finding doctors who accept Medicare can be a challenge. As I learned, “reimbursement rates are too low and paperwork too much of a hassle.”

Efficiency? Let’s ponder. I think it’s safe to say that using price as a proxy for inputs is a well-accepted approach.

In terms of my personal inputs and outputs, Medicare is significantly more efficient than paying cash: I would allocate 54% more cash inputs to garner the same output. Cash is only 65% as efficient as Medicare.

In terms of overall efficiency, things swing dramatically the other way. Total inputs vs. output now show the market route as being 3.23 times as efficient as the collectivist option.

Consider the societal impact for the pool of non-beneficiaries. Quite literally, the difference is infinite: zero payment vs. $8,800. Given the incentive to save $1,200, most would be more than willing to impose an $8,800 cost upon that group of non-beneficiaries.
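Here is a quick sketch of that arithmetic, using only the figures from the two scenarios above.

```python
# Figures from the two hospital scenarios above.
cash_price = 3_400        # cash price quoted at check-in
medicare_billed = 11_000  # effective rate billed to Medicare
coinsurance = 0.20        # patient's Medicare share

my_medicare_cost = coinsurance * medicare_billed     # what I pay via Medicare
paid_by_others = medicare_billed - my_medicare_cost  # absorbed by non-beneficiaries

# My perspective: how much extra I spend by paying cash.
print(f"Cash vs. my Medicare share: {cash_price / my_medicare_cost - 1:.1%} more")
print(f"Cash efficiency relative to Medicare (for me): {my_medicare_cost / cash_price:.0%}")

# Overall perspective: total resources consumed for the same surgery.
print(f"Market route vs. collectivist route: {medicare_billed / cash_price:.1f}x")
print(f"My saving by billing Medicare: ${cash_price - my_medicare_cost:,.0f}; "
      f"cost imposed on non-beneficiaries: ${paid_by_others:,.0f}")
```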

Welcome to Public Choice Theory.


Krugman’s Laffer Curve Admission: Business Tax Rates Too High and Non-Optimal

April 29th, 2012 · 3 Comments · Economics

I recently added ABC’s This Week back to my TiVo since ABC took my veiled and tongue-in-cheek advice and jettisoned the talented but regrettably snooze-inducing Christiane Amanpour. (Great to have George S. back, but I’d love to see the awesome Jake Tapper in the host chair again.)

Today’s episode featured a mildly bullying interchange between Krugman and Carly Fiorina in which he either completely misunderstood her point or purposely attempted to put misleading words in her mouth. In the process, he inadvertently validated the notion that our corporate tax rates are too high, and that the Laffer Curve is alive and well.

The topic of conversation was why U.S. corporations are moving their operations overseas. Check out this (edited) clip below:

Fiorina says our business tax rates are the highest in the world, which is true (in the case of corporate rates):

[Chart: corporate tax rates]

Krugman aggressively attempts to “refute” her point by oddly changing the subject to tax receipts(??). While he is accurate in his assessment that receipts are comparatively low (at least as a percentage of GDP), this completely validates Fiorina’s main point about how businesses will aggressively pursue avoidance strategies (which include moving overseas!).

Decision: Fiorina

Bottom line: high rates induce companies to move overseas, which results in lower receipts. Make rates way too high and you’ll see an obvious disconnect between the two numbers. Sounds like a classic manifestation of the Laffer Curve to me…
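As a toy illustration only: if the taxable base shrinks as the statutory rate climbs (profits and operations shift abroad), receipts can peak and then fall even as the rate keeps rising. Every number below is hypothetical, chosen just to show the shape of the effect.

```python
# Toy Laffer-style illustration. Every number here is hypothetical and
# chosen only to show the shape of the effect, not to estimate anything.
def taxable_base(rate, base_at_zero=100.0, avoidance=1.5):
    """Hypothetical corporate profit base that stays onshore at a given rate."""
    return base_at_zero * max(0.0, 1.0 - avoidance * rate)

def receipts(rate):
    return rate * taxable_base(rate)

for rate in (0.10, 0.20, 0.30, 0.40, 0.50):
    print(f"statutory rate {rate:.0%}: receipts {receipts(rate):5.1f}")
```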


Dear Matthew Yglesias: Yes, Econ is Hard

April 11th, 2012 · 10 Comments · Economics

Economics is hard.

I was sent this piece, written by philosophy major Matthew Yglesias, yesterday. In it he attempts to emulate thinking like an economist, and signals in a way that the untutored will likely find impressive. Phrases like “regressive,” “subsidy,” and even an “at the margin” are sprinkled in liberally. (Strangely, and very conspicuously for this specific instance, he omits the well-worn favorite of progressive pseudo-economists, and the ever-present justification for market intervention: “externality.”)

Here is his core argument (emphasis mine):
“…developers assume that there’s widespread demand for cable TV and modem hookups, so they provide them…right now, new construction projects are generally required to provide parking, in effect taxing households with a below-average quantity of vehicles in order to subsidize households with an above-average quantity…at the margin does Seattle need to subsidize extra parking?”

Those who have a distaste for cars, gratuitous mobility, or greenhouse gases will no doubt inherently embrace this line of thinking.

Those of you who have sat through a few econ classes and have pondered scores of case studies over several years will no doubt immediately see numerous holes in his logic. Let’s walk through a few of them:

Parking, Cable and Their Supply Curves: Huge Differences in Incentives for Developers to Provide Them

Inserting cables into walls as they are being constructed imposes negligible marginal costs on the builder and absolutely zero opportunity cost (cables in walls don’t displace housing units), SO DEVELOPERS “PROVIDE THEM.” (Economists universally visualize supply curves in conjunction with demand curves, BTW.) Things aren’t made widely available just because people “demand” them.

More parking spaces absolutely mean fewer sellable apartments and/or smaller, less profitable ones. Huge difference.

Result? Builders are far more averse to displacing apartments with parking than they are to sticking cables in walls. Naturally they will maximize dwelling units and minimize parking.
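A stylized sketch of that calculus, with every number purely hypothetical and chosen only to illustrate the asymmetry:

```python
# Stylized developer calculus -- every number is hypothetical.
# Cable conduit: a small per-unit cost, displacing no floor area.
# A parking stall: construction cost PLUS floor area that could have been
# part of another sellable unit (the opportunity cost).
unit_profit = 60_000          # hypothetical profit per apartment
cable_cost_per_unit = 500     # hypothetical wiring cost per apartment
stall_build_cost = 25_000     # hypothetical construction cost per stall
units_lost_per_stall = 0.25   # hypothetical fraction of a unit displaced

cable_total = cable_cost_per_unit
stall_total = stall_build_cost + units_lost_per_stall * unit_profit

print(f"Cable: ~${cable_total:,} per apartment, no units displaced")
print(f"Parking stall: ~${stall_total:,.0f} each, including forgone unit profit")
```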

Parking is a Congestible Public Good, Cable TV is a Private Good

Unlike parking, functional cable TV connections don’t universally sprout from the street for miles around. Since on-street (and other) parking options exist, developers will inherently prefer to free ride on that capacity in order to maximize their revenues. This means fewer parking spaces than desired and, hello, tragedy of the commons. (Irony alert: I’m referring to an externality problem…) The logical solution is to ensure developers don’t impinge on existing public capacity, and minimums are the easiest way to do that.

Arnold Kling (Econ PhD — sadly, no Philosophy credentials) deals with this here:

“Ah, but there’s the issue, don’t you see? How do you deal with the Coasian bargaining issues? Suppose somebody wants to put a big apartment building in my neighborhood, without providing parking, creating major inconvenience for those of us who no longer will have street parking available. How do we arrange for the developer to compensate us, or for us to pay the developer to provide us with parking?”

People Still Universally Drive Cars (and Need to Park Them)

Unicorns and rainbows aside, Seattle dwellers have not abandoned their automobiles for streetcars and fixies. If you are a Seattleite renting an apartment, you have a car and need to park it somewhere. Yglesias asserts that 16 percent of households don’t have cars — I don’t believe it. Given the still-ubiquitous nature of cars and the incentive for developers to not provide adequate supply, mandated minimums are appropriate.

Parking is Fungible

A “below-average quantity of vehicles” problem? Did you rent an apartment that includes a parking spot? Are you that one guy without a car? Are you a victim? No. See a doctor, and if he can’t cure you of your freedom/mobility-aversion, rent the space out to someone else. Problem solved.

“Extra” Parking?

“Extra”? Do mandates provide “extra” parking above what’s needed? Based on the evidence above, try “enhancing supply in an attempt to better align with a proper market equilibrium” instead.

Have any readers tried to park in Seattle recently? Does it seem like there is “extra” parking or a ton of untapped capacity due to these regressive governmental mandates? Anyone? Bueller? I didn’t think so…


My Take: Supreme Court Justices are Human and Affected by Incentives

March 28th, 2012 · No Comments · Economics

It appears that Obamacare is facing an uphill battle in the Supreme Court:

LA Times: Supreme Court greets healthcare mandate with skepticism

Chicago Sun Times: Justices’ queries could signal trouble for health-care law

CBS News: Supreme Court majority skeptical on health care law

Washington Post: Supreme Court expresses doubts on key constitutional issue in health-care law

While unconstrained thinkers like Nancy Pelosi find the notion of a constitutional challenge laughable, the concept of the commerce clause having limits seems logical to this person of constrained vision.

A few evenings ago I mentioned to some friends one other reason I felt Pelosi was off-base, and that Obamacare might be in trouble: Incentives. Imagine that you are a Supreme Court justice. Does it seem rational (given how the constitution is written) that you would take a position that would result in a significant reduction in your ability to direct future outcomes? Why would you rule that the commerce clause has no limits? Seems to me that would just mean you would rule yourself into irrelevance.

You don’t need to be a professor of constitutional law at Harvard to predict that Obamacare and the SCOTUS might be on a collision course. In fact, it may be best if you’re not so academically immersed.

As Charlie Munger says:

“Never, ever, think about something else when you should be thinking about the power of incentives.”

Related: Sowell refers to judges and incentives here.


Goodhart’s Law

March 18th, 2012 · No Comments · Goodhart's Law

I’ve added Goodhart’s Law as a category. It’s well worth noting:

“Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”

Something to consider whenever you hear the word “targeting”.
