Sunday, January 28, 2018

Mobile Phone Use as a Proxy for Creditworthiness

I was chatting with a banker recently about the use of mobile phone behavior to assess borrower risk in areas where most people do not have credit scores or banking relationships. She was skeptical. I don’t blame her.

But many now believe that analysis of mobile data--relationships, communities, frequency of communication and other evidence of mobile phone use--could, indeed, be used to assess credit risk.

There are many straightforward indicators of behavior that are plausibly related to loan repayment. A responsible borrower may keep their phone topped up to a minimum threshold so they have credit in case of emergency, whereas one prone to default may allow it to run out and depend on others to call them.

An individual whose calls to others are returned may have stronger social connections that allow them to better follow through on entrepreneurial opportunities.

As you would guess, such techniques are most valuable in the global South.

One obvious source of data is remittances received on a phone (M-Pesa, for example). It also seems to matter whether contacts on a mobile phone include both first and last names.

That bit of data alone can mean a 16-fold difference in default rates on loans. Micro-loan provider Tala analyzes mobile phone behavior such as the size of the applicant’s network. Consistency--such as a daily call to parents--and where a person goes each day also make a difference.

About eight to 10 questions seem to be enough to establish a proxy creditworthiness score.
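
To make the idea concrete, a proxy score might simply weight a handful of such behavioral signals. The sketch below is purely illustrative: the feature names, weights and thresholds are hypothetical, not drawn from Tala or any other lender’s actual model.

```python
# Illustrative proxy credit score from mobile phone behavior.
# All feature names, weights and thresholds are hypothetical examples,
# not any actual lender's model.

def proxy_credit_score(applicant: dict) -> float:
    """Combine a few behavioral signals into a 0-100 proxy score."""
    score = 50.0  # neutral starting point

    # Keeping airtime topped up above a minimum threshold
    if applicant.get("avg_airtime_balance", 0.0) >= 1.0:  # e.g., $1.00
        score += 10

    # Calls that get returned suggest stronger social ties
    score += 20 * applicant.get("returned_call_ratio", 0.0)

    # Share of contacts saved with both first and last names
    score += 10 * applicant.get("full_name_contact_ratio", 0.0)

    # Regular habits, such as a daily call to parents
    if applicant.get("calls_daily_contact", False):
        score += 5

    # Receiving mobile-money remittances (M-Pesa style)
    if applicant.get("receives_remittances", False):
        score += 5

    return min(score, 100.0)


print(proxy_credit_score({
    "avg_airtime_balance": 1.5,
    "returned_call_ratio": 0.8,
    "full_name_contact_ratio": 0.6,
    "calls_daily_contact": True,
    "receives_remittances": True,
}))  # -> 92.0
```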

Certain behavioral patterns are remarkably accurate in predicting the probability of default among borrowers without formal financial histories, even for very poor borrowers whose mobile phone usage is extremely limited, according to studies cited by the World Bank.

Higher-risk borrowers used their phones infrequently: in one study, they placed only 22 minutes of calls, sent a single text message and spent a total of $2.85 over a period of 11 weeks.

Individuals in the highest quartile of risk were six times more likely to default than those in the lowest quartile.

A bank that participated in a study found it could eliminate 43 percent of loan defaults by declining the riskiest 25 percent of applicants.
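
The arithmetic behind that sort of claim is easy to reproduce: rank applicants by predicted risk, decline the riskiest fraction and then measure what share of all defaults fell within the declined group. Here is a minimal sketch using synthetic data (the exact share it prints will not match the bank’s 43 percent):

```python
# Sketch: how declining the riskiest 25% of applicants can remove a
# disproportionate share of defaults. All data here is synthetic.
import random

random.seed(7)

# Synthetic applicants: (risk_score, defaulted). Higher score = riskier,
# and default probability rises with the score.
applicants = []
for _ in range(10_000):
    risk = random.random()
    defaulted = random.random() < 0.05 + 0.25 * risk
    applicants.append((risk, defaulted))

# Decline the riskiest 25 percent.
applicants.sort(key=lambda a: a[0], reverse=True)
cutoff = len(applicants) // 4
declined = applicants[:cutoff]

total_defaults = sum(d for _, d in applicants)
avoided = sum(d for _, d in declined)
print(f"Declining 25% of applicants avoids "
      f"{avoided / total_defaults:.0%} of defaults")
```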

At one level, this chart only illustrates the fact that developed nation citizens have more income, cash or wealth than citizens in developing nations.



Likewise for digital payments, citizens in developed markets tend to use such mechanisms more than citizens in developing nations.

Saturday, January 27, 2018

Federal Preemption Coming in Internet Access Business?

Communications that cross state lines generally have been regulated differently than communications confined within a single state, or parts of a state. In the internet era, data communications have tended not to be regulated very much--a “hands off” approach that fits the generally highly-distributed nature of modern computing.

In more recent times--in the wake of the Telecommunications Act of 1996--there was a perhaps-necessary clarification of state and federal roles, mostly in the area of federal preemption of state and local rules.

The logic has been that, for clear efficiency reasons, it does not make sense to have potentially 50 sets of rules for communications that are, almost exclusively, interstate or global in nature.

It seems almost inevitable that we will have some form of the federal preemption debate as policy on internet access potentially fractures with imposition of state rules on internet access. AT&T, for example, already has started calling for federal rules to re-establish or preserve a single national policy.

That comes as some states and localities create their own policies for internet access, once again raising the issue of fractured policies across the nation. As those of you who work in tariff and taxation areas know, it is devilishly-complex to comply with all local and state regulations when you are running a nationwide business.

That, in fact, is behind the whole European Union project: ending the friction that comes with multiple regulatory and currency regimes within what increasingly is a single market.

“It is time for Congress to end the debate once and for all, by writing new laws that govern the internet and protect consumers,” AT&T says.

Given the obscurity of network neutrality in general, and the way it has been weaponized politically, it might be reasonable simply to point out the areas where nearly everybody continues to agree.

We all agree that consumers must be able to use all lawful services and applications. There can be no blocking of lawful applications.

Such applications cannot be throttled or downgraded based solely on the ownership of specific sites and content.

Everyone has agreed on these principles for more than a decade--including, even if most seem not to realize it, AT&T and other major internet service providers.

“We don’t block websites; we don’t censor online content. And we don’t throttle, discriminate, or degrade network performance based on content. Period,” AT&T says.

But “Congressional action is needed to establish an ‘Internet Bill of Rights’ that applies to all internet companies and guarantees neutrality, transparency, openness, non-discrimination and privacy protection for all internet users.”

“Legislation would not only ensure consumers’ rights are protected, but it would provide consistent rules of the road for all internet companies across all websites, content, devices and applications,” AT&T argues.

At this point, and ironically, it is as much the major app providers--not just ISPs--that probably have to worry about what that means. If the objection to changing the “best effort only” level of consumer internet access is about preventing the emergence of gatekeepers, we have problems far beyond “who owns the access pipe.”

Actual instances of “commercial blocking” have occurred--but by firms such as Amazon and Google, not ISPs.

In the coming debate, the need for predictable rules across the whole country will be stressed, as it has been in the past, and for the same reasons. To be sure, AT&T’s concern is about future services whose performance does matter, and which might clearly benefit from optimization, as do consumer apps whose performance is assured by use of content delivery networks.

Ironically, most larger content and app providers already use content delivery networks, precisely for the purpose of optimizing performance of their consumer apps.


“In the very near future, technological advances like self-driving cars, remote surgery and augmented reality will demand even greater performance from the internet,” AT&T says. “Without predictable rules for how the internet works, it will be difficult to meet the demands of these new technology advances.”

To be sure, the issue all along has not been “lawful use of apps” and “no blocking” but the development of quality-assured or other services whose costs are defrayed by an enterprise.

Some ISPs and app providers have argued for the freedom to offer “toll free” services--offered at no charge to end users--alongside the for-fee models, in the same way that toll-free calling is offered alongside paid calling. Internet.org, for example, has tried to create no-charge internet access programs for mobile customers in developing markets.

To be sure, business services are not covered by network neutrality rules. The problem is that the line between enterprise and consumer services increasingly is blurred. Virtual private networks, for example, can be used by business or consumer end users.

The fear in some quarters, perhaps logically, is that quality-assured internet access eventually becomes what high definition is to standard definition video, or 4K to HDTV: a “better” level of service that eventually forces app providers to upgrade, possibly paying a transport provider to do so, as already happens with content delivery network payments.

The point is that CDNs are lawful and routinely used. Why would CDNs “to the end user” not be lawful? And if they are, does that business require uniform national rules, given that CDNs almost intrinsically operate across state lines?

Friday, January 26, 2018

Most Firms, in Most Industries, Must Recreate Their Business Models

The internet has almost uniformly been positive for consumers--generating new value--while allowing some firms to ride new value propositions to huge business success. On the other hand, the internet has generally been difficult, financially, for nearly all incumbent firms.

“Digital is confounding the best-laid plans to capture surplus by creating—on average—more value for customers than for firms,” McKinsey consultants say.

Telecom service providers know the process well. A shift to over-the-top, internet-based applications allows consumers to use product substitutes (WhatsApp, Skype, Netflix) instead of buying service provider products.

That both makes telco markets smaller and reduces revenue and profit potential for the consumer demand that remains. At least when it comes to intangible or software products (voice, messaging, content, apps and features), the cost of incremental usage is close to zero.

Prices become much more transparent, while new alternative suppliers emerge to provide lower-cost or free substitutes.

In other words, as with most other industries, use of direct internet distribution reduces the need for, and value of, intermediaries and distributors.

To the extent that the marginal cost of supplying the next unit of any product is nearly zero, retail prices will trend toward zero. But the problem is not exclusively faced by telcos.

Internet-based competition has “siphoned off 40 percent of incumbents’ revenue growth and 25 percent of their growth in earnings before interest and taxes (EBIT), as they cut prices to defend what they still have or redouble their innovation investment in a scramble to catch up,” McKinsey argues.


The point is that telcos and other internet service providers necessarily must replace legacy businesses and products with new business models and products.

That is why some of us believe retail service providers (business-to-consumer) must move up the stack. The incumbent business models are breaking down.

Suppliers in the business-to-business segments of the market might have other constraints or opportunities. It is hard to see how most capacity suppliers, for example, actually can move “up the stack,” though all such firms now have moved from a “voice capacity” to “data capacity” revenue model.

The arguably more-important growth has mostly been “new geographies” or “new and redundant capacity in existing geographies.”

Building or acquiring new routes outside the current footprint is an example of the former. Building new cables across the Pacific or Atlantic oceans is an example of the latter strategy.

The main point is that virtually every business faces similar challenges in making the transition from legacy to next-generation business models.

S&P 500 4Q 2017 Telecom Earnings Uniformly Below Expectations

It is just a snapshot, but the telecom segment of the Standard & Poor’s 500 Index fared dead last among industry segments when it came to earnings above expectations in the fourth quarter of 2017.

Perhaps no single market is experiencing greater shocks than the Indian telecom market, as rapid consolidation follows dramatically-lower earnings. Vodafone saw a 39-percent drop in profits in the first half of 2017. Bharti Airtel profit dropped 39 percent (consolidated net profit fell 77 percent) in the third quarter.

In Europe, it appears that profit is stabilizing, even if there is little revenue growth.

Source: @FactSet

Most Big Data Projects Fail to Some Extent

According to Resulticks, only 21 percent of marketers say their big data software delivers on all its big data promises. About 52 percent of surveyed respondents believe big data projects deliver “some” of what vendors promise.

That is not a new story for virtually any type of enterprise computing initiative. Few big new initiatives actually succeed on the level originally promised. Many likely fail outright.

According to some studies, enterprise “digital transformation” success rates have been as low as 13 percent.

That reflects the larger story that major investments in new technology platforms have tended to lag in producing measurable gains in productivity, sometimes for a decade or more.

That seems to be the broader pattern for some systemically-important technologies such as electricity, steam power, internal combustion engines and other general purpose technologies.

That also has tended to be the trend when enterprises have invested heavily in new computing technologies. There are many theories about “why” the pattern exists. Some think the problem is that we cannot measure the changes.

That is unsatisfying, so many believe the issue is that technology platforms deliver measurable advantages only after business practices are reimagined and refashioned to take advantage of the new technology. Time after time, we have found that big new investments in new technology do not produce measurable results for a decade or even more.

If that were routinely expected, nobody would make the investments. So the expectation is that the payoff will come within three years. Measurable value creation, generally speaking, takes much longer.

Tuesday, January 23, 2018

How Many "IoT" Devices Already are in Use?

Is it possible there already are as many as 27 billion internet of things devices in use globally? Most of us would say “no way.” But it all depends on the definition one uses for “internet of things.” Some definitions arguably are too broad.

For example, there is a difference between “connected devices” and “internet of things” devices. There might be 16 billion mobile phones and PCs--all “connected”--in use in 2017. But that seems to stretch the definition of IoT too far.

Using a narrower definition that excludes mobile phones, PCs and tablets, IoT would include all manner of other sensors that communicate. On that definition, there might well be as many as 10 billion IoT devices already in use, including more than four billion industrial and commercial sensors. Medical devices and sensors used in transportation might represent about a billion more.



Sunday, January 21, 2018

Reliance Jio Earns "Profit" in Less than 2 Years (Arbitrage, Accounting Rules Help)

It has been two decades since I’ve seen anything like the apparent regulation-assisted business model changes that apparently have helped Reliance Jio earn a profit within two years of launching its attack on the India mobile market.

The profit also is based on accounting rules, as Jio still has negative cash flow: Reliance Jio is able, for example, to capitalize some operating expenses.

Still, it is fair to note that some regulatory changes have simultaneously harmed Reliance Jio’s biggest competitors and helped Jio reduce its own operating expenses.

The last time I saw this sort of regulatory arbitrage was pre-2000, when incumbent and upstart telecom firms sparred over reciprocal compensation--fees paid to a firm for terminating, on its network, calls that originate on other service providers’ networks.

Basically, because such fees were very generous in a few locales, long distance conferencing services set up shop in those areas, charging very-low calling fees and essentially making their money on the reciprocal compensation fees earned on calls inbound to the conference calling centers.

The same idea was used by call center operations, where most of the traffic, by definition, is inbound rather than outbound.

The same arbitrage could be used by dial-up internet access providers, since--again by definition--the customer traffic was inbound from other networks (customers dialing in to create an access session).

Disparities in traffic flow also underpin the economics of rural and other small telecom companies, where long distance calls (disproportionately important in rural areas) generate an originating access fee paid by the long distance carrier to the network on which the call originates.
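
The arbitrage economics are simple enough to sketch. With entirely hypothetical per-minute rates, a conferencing provider profits so long as the reciprocal compensation fee it earns exceeds the gap between its costs and its deliberately-low retail price:

```python
# Back-of-the-envelope reciprocal-compensation arbitrage. Every rate
# below is a hypothetical illustration, not an actual tariff.

recip_comp_rate = 0.007   # $/min paid by the originating carrier (hypothetical)
retail_price = 0.002      # $/min charged to conference callers (hypothetical)
cost_per_minute = 0.004   # platform and network cost (hypothetical)

inbound_minutes = 10_000_000  # monthly inbound conference minutes

margin_per_minute = retail_price + recip_comp_rate - cost_per_minute
print(f"Monthly margin: ${inbound_minutes * margin_per_minute:,.0f}")
# -> $50,000. The business works only while the termination fee exceeds
# the cost gap -- which is why it is arbitrage, not a durable model.
```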

The point is that, at crucial times, regulatory arbitrage can provide a bit of breathing room while erstwhile upstarts sprint to gain market share and reach sustainability. Arbitrage likely is not a sustainable strategy for Jio, any more than it has proven to be for many other service providers.

But, at least in principle, such arbitrage can help in the formative years.

Saturday, January 20, 2018

FCC Definitions are Floors, Not Ceilings

Defining what broadband means now is an arbitrary exercise, if a necessary task to measure progress. Under the current minimum definition for fixed networks--25 Mbps in the downstream--many internet access services actually cannot be marketed as “broadband,” using the Federal Communications Commission definition.

People, app experience and markets are not affected by any such definitions, of course. It probably does not matter at all that fixed network 10 Mbps Ethernet is not “broadband,” using the FCC definition.

The definition does not apply to wireless or mobile networks, though, a nuance that often is missed.

Still, for most users, it does not matter that most of their Wi-Fi and mobile internet access sessions are not “broadband,” using the fixed network definition. What matters is that user experience is good enough to provide satisfactory interactions.

“Satisfactory” often hinges on the actual use case, of course. Relatively modest speeds--somewhere between 5 Mbps and 25 Mbps--are required for most consumer apps, including video. “Twitch” gamers mostly will need more.

Also, floors are not ceilings. Availability is not usage. In fact, U.S. consumer internet access speeds double about as fast as Moore’s Law would predict, and grow by an order of magnitude about every five years.
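
The arithmetic works out: a doubling roughly every 18 months--the classic Moore’s Law cadence--compounds to about an order of magnitude over five years:

```python
# Doubling every ~18 months compounds to roughly 10x in five years.
months = 5 * 12
doubling_period = 18  # months, the classic Moore's Law cadence
growth = 2 ** (months / doubling_period)
print(f"Growth over 5 years: {growth:.1f}x")  # -> about 10.1x
```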

By some measures, U.S. average speeds are in the range of 19 Mbps. By other tests, even mobile access speeds are in the 23 Mbps range. Some other tests show 2017 average speeds of 55 Mbps.  


Though we tend not to pay much attention, U.S. fixed network internet access speeds used by consumers have grown about as fast as Moore’s Law would predict, at least on cable TV networks.

Cable One Offers Gigabit Internet Access to 95% of its Passings

Cable One’s “GigaONE” gigabit internet access service is now available to residential customers across more than 95 percent of its U.S. footprint, representing more than 200 communities.

The primary impact, ironically, likely will be that more people buy access at lower speeds. The reason is that when gigabit services are offered, the price of lower-speed tiers tends to drop. And, as you would guess, consumers buy more of a product they like when the price is lower. Verizon, for example, introduced its new gigabit-per-second service at a retail price half that of its former 760-Mbps service.

Gigabit service launches tend to reduce prices of services in the 100-Mbps to hundreds-of-megabits-per-second range by about $27 per month, or about 25 percent, according to an Analysis Group study.

In markets where gigabit service has been introduced, prices for internet access at 25 Mbps and lower speeds also tend to drop, by 14 percent to 19 percent.

Likewise, when two providers sell gigabit services, prices for that service tend to decline by $57 to $62 per month, or 34 percent to 37 percent less.
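
Assuming the study’s dollar and percentage figures describe the same baselines, those numbers also imply the rough starting prices, which can be backed out directly:

```python
# Backing out implied pre-drop prices from the Analysis Group figures:
# implied starting price = dollar drop / percentage drop.

cases = {
    "100-Mbps-class tier after a gigabit launch": (27, 0.25),
    "gigabit tier, two gigabit providers (low end)": (57, 0.34),
    "gigabit tier, two gigabit providers (high end)": (62, 0.37),
}

for label, (drop_dollars, drop_pct) in cases.items():
    print(f"{label}: ~${drop_dollars / drop_pct:.0f}/month before the drop")
```

Both two-provider cases back out to roughly the same $168 starting price, which suggests the study’s figures are internally consistent.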

Actual revenue upside might also be complicated. On one hand, gigabit sells for a higher price. But gigabit availability also tends to mean prices for lower-speed tiers fall. So net incremental revenue is tough to evaluate.

Take rates are part of the equation. Some believe adoption of gigabit services could range between five and 10 percent, in markets where lower-speed tiers also are available.

"Price anchoring" is the reason most consumers able to buy gigabit internet access will not do so. Price anchoring is the tendency for consumers to evaluate all offers in relationship to others. As the saying goes, the best way to sell a $2,000 watch is to put it right next to a $10,000 watch.

Anchoring is why “manufacturer’s suggested retail pricing” exists. It allows a retailer to sell a product at a price the consumer already evaluates as being “at a discount.” Price anchoring is why a “regular price” and a “sale price” are shown together.

In the internet access business, price anchoring explains why gigabit access is priced in triple digits, low speeds are priced in low double digits, and the tiers most consumers buy are priced in between those extremes.

Service providers who sell a range of internet access products differentiated by speed and price might “typically” find that a minority of customers actually buy the “fastest” tier of service. That is largely because of price anchoring.

People often evaluate the “best quality offer, at the highest price” against the “lowest quality offer, at the lowest price,” before concluding that the “best” value is the mid-tier quality, at the mid-tier price.

That was true in the past when the top speed was 100 Mbps as well. Most consumers did not buy the “highest quality” offer, whatever it was.

So it can be argued that gigabit internet access speeds have complex effects on internet service provider business models. Most customers will not buy the top speed, but many will upgrade to faster tiers of service. At the same time, prices generally fall on a “cost per Mbps” basis.

Consider that Comcast internet access average revenue per account is about $40 a month. Given that Comcast gigabit offers, where it faces little competition, are as high as $160 a month, and perhaps as low as $70 where Comcast faces gigabit competitors, that $40 average suggests uptake of the fastest tiers of service remains less robust than some would imagine.
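
A simple blended-average calculation shows why. If a fraction of accounts pays the $160 gigabit price and the rest pay, say, $35 a month (an assumed figure for illustration, not a Comcast number), a $40 blended average implies a gigabit take rate of only about four percent:

```python
# Implied gigabit take rate from a blended $40 ARPU, assuming (purely
# for illustration) that non-gigabit accounts average $35/month.
gig_price = 160.0
base_price = 35.0    # assumption, not a Comcast figure
blended_arpu = 40.0

# Solve: gig_price * t + base_price * (1 - t) = blended_arpu
take_rate = (blended_arpu - base_price) / (gig_price - base_price)
print(f"Implied gigabit take rate: {take_rate:.0%}")  # -> 4%
```

The implied take rate is about four percent under this assumption; the qualitative point survives reasonable changes to the assumed base price.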

Against that, ISPs must balance the capex needed to build the faster networks, as well as evaluate the upside from any new apps and services that might be enabled by faster networks, top speeds or rising average speeds.

The new wrinkle is that ISPs often make gigabit service available in neighborhoods where demand is highest. Doing so might lead to 30 percent take rates in those neighborhoods, as AT&T claims.

On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of “potential black swans,” they misunderstand the concept. As explained by Nassim Taleb ...