Wednesday, February 6, 2019

Pricing Models in Precision Agriculture (IoT-based Services)

Revenue models for precision agriculture (the use of internet of things sensors, apps and services) might not be directly applicable to the connectivity business, but will have to be understood if service providers decide to explore roles in distribution (sales channels) of such services.

Retailers who offer precision agriculture programs often charge by the acre, per pass or service, as a percentage of yield gain, or per unit of product sold.

Those roughly correspond to connectivity business pricing models based on flat rate, pay per use, revenue sharing or usage volume.


Pricing model: bundled per acre. A flat per-acre rate, with all “precision services” offered by the retailer included in the plan.

Advantages:
  • Ease of billing.
  • Farmer knows cost up front.
  • Separates customers into two distinct groups (participating in the precision program or not).
  • Allows for easier analysis of yield trends within each of the two groups.
  • Allows group data analysis to be presented to the entire group in the program, to improve production within the group.
  • Likely to create a closer relationship for all inputs with the grower (less shopping around).
  • Easier to build in a call center/service center fee.

Disadvantages:
  • Large upfront commitment for the producer.
  • Potential to be an expense that is cut during low-margin years.
  • Each additional service outside the package requires a surcharge.
  • Difficult to “dabble” with just a few acres; the farmer usually must commit to all or none.
  • Machinery limitations of the grower may not allow full implementation of the program.
  • Accurate data analysis requires trusting the grower and operators to properly adjust and calibrate equipment.

Pricing model: per pass or service. Each pass across the field, or each individual service, has its own price.

Advantages:
  • Able to charge a higher total price than the bundled model, due to less “sticker shock.”
  • Allows producers to pick and choose for their individual needs.
  • Allows the company to analyze margins on each individual service.
  • Ease of entry to, and exit from, specific services.
  • Grower and retailer work together to develop the best plan for each operation.

Disadvantages:
  • Bookkeeping becomes more challenging, as each service must be tracked separately.
  • Harder to prove the benefits of a service, as there is no large group of producers following the same practices.
  • Requires higher effort from the sales force, as each individual product or service must be “sold.”
  • Easy for the grower to say “not this year” to certain services.

Pricing model: percentage of yield bump or gain. Example: a 10-bushel-per-acre yield bump is added to a grower’s average, and the grower shares four of those bushels with the service provider.

Advantages:
  • Places accountability with the service provider to deliver results; the saying “put your money where your mouth is” applies here. The grower is more likely to commit, since it is obvious that payment is owed only if there is a yield gain.
  • Potential for higher adoption by growers and producers.
  • Keeps the service provider in closer contact with the grower throughout the year, as the provider’s paycheck depends on the grower’s success.
  • A unique approach, sure to make conversation as farmers talk to farmers, providing self-advertisement.

Disadvantages:
  • Delayed collections, as the retailer must wait for harvest to be completed.
  • Difficulty establishing a baseline yield.
  • Weather variability.
  • Measurement of yield (yield monitor, grain cart, certified scale tickets?); is a yield monitor calibrated by the grower accurate enough?
  • Determining a market price for the grain.
  • Risk that a negligible or nonexistent yield bump means no payment to the service provider.
  • May “discourage” growers from shooting for higher yields.
  • No consideration for reduced inputs.
  • The model may not be sustainable long term.

Pricing model: per unit of product sold. The service is bundled with each unit of fertilizer, chemical, seed and so on.

Advantages:
  • Ease of bookkeeping.
  • Service charges are buried within the price of the retailer’s commodities.
  • Spreads the overhead cost of precision ag professionals and technologies over all customers.

Disadvantages:
  • Encourages growers to cut back on inputs.
  • Easier for competition to undercut the retailer’s price if the grower does not see the value of buried service charges.
source: PrecisionAg
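
To make the four models concrete, here is a minimal sketch of what each might cost a grower. All rates, fees, prices and acreage are hypothetical illustrations, not figures from the PrecisionAg source; the yield-share numbers reuse the ten-bushel-bump, four-bushels-shared example above.

```python
# Illustrative comparison of the four pricing models above.
# All rates, fees, prices and acreage are hypothetical examples.

ACRES = 1_000

def bundled_per_acre(rate=12.00):
    """Flat per-acre rate, all precision services included."""
    return ACRES * rate

def per_pass(per_acre_fees):
    """Each pass/service priced individually, per acre."""
    return ACRES * sum(per_acre_fees.values())

def yield_share(bump_bu=10.0, shared_fraction=0.4, grain_price=3.75):
    """Grower shares a fraction of the measured yield bump;
    no bump means no payment to the service provider."""
    return ACRES * max(bump_bu, 0.0) * shared_fraction * grain_price

def per_unit(units_sold, fee_per_unit=0.50):
    """Service fee buried in each unit of seed, fertilizer, etc."""
    return units_sold * fee_per_unit

print(bundled_per_acre())                                    # 12000.0
print(per_pass({"soil_sampling": 4.0, "vrt_seeding": 5.0}))  # 9000.0
print(yield_share())     # 10 bu bump, 4 bu shared: 15000.0
print(per_unit(30_000))                                      # 15000.0
```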

How do You Know a Process is Using Machine Learning?

Most of the time, the typical consumer or worker encountering any form of artificial intelligence is interacting with some process enhanced by machine learning. Machine learning uses algorithms and statistics to find patterns in massive sets of data, then uses those patterns to make predictions.

source: MIT Technology Review
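
As a rough illustration of that loop--find a pattern, then use it to predict--consider this toy sketch. The data and the linear model are invented for illustration; production systems use far larger datasets and richer models.

```python
# Toy illustration of the machine learning loop: find a pattern
# in data, then use that pattern to predict. The data here is
# invented; real ML works on far larger datasets and richer models.
import numpy as np

hours = np.array([4, 5, 6, 7, 8, 9], dtype=float)         # inputs
yields = np.array([40, 47, 55, 61, 68, 75], dtype=float)  # outcomes

# "Learning": fit a line (slope, intercept) to the observed pairs
slope, intercept = np.polyfit(hours, yields, deg=1)

# "Prediction": apply the learned pattern to an unseen input
print(slope * 10 + intercept)  # predicted outcome at input 10
```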

Monday, February 4, 2019

AI is a Feature, Not a Product

Channel partner organizations in both communications and information technology industries already are trying to figure out what they can sell to business customers in the artificial intelligence area.

The problem is that, for most products, AI is an attribute or function, not a stand-alone product. As applied to information technology operations of various types, that means AI is a feature of a product, not the actual product that is viewed as the solution to a business problem.



Sunday, February 3, 2019

Is Cost of Internet Access a Problem? Yes and No

People sometimes complain about high prices for U.S. internet access. Measured as a percentage of household income, prices are not high, in the United States or in any other developed nation, but people believe the “high price” charge anyway.

According to the International Telecommunication Union, prices adjusted to reflect purchasing power parity across nations suggest mobile broadband cost perhaps $15.90 a month in 2015 in developed nations, compared to a world average of $26.70, and far lower than the $30.80 paid in developing countries or $39.90 in least developed countries.

Looking at fixed broadband prices, customers in developed countries paid about $27.80 a month in 2015, compared to the world average of $56.30, the developing-country average of $67.30 and the least-developed-country average of $134.

Likewise for mobile broadband, which in developed countries costs about one percent of gross national income per person, compared to the world average of five percent, the developing-country average of perhaps 7.5 percent and the least-developed-country average of about 16.5 percent of GNI per person.
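
For those who want to see how such affordability figures are computed, here is a sketch of the metric: annualized plan price as a share of GNI per person. The prices are the 2015 figures cited above; the GNI values are back-calculated placeholders chosen only to reproduce the quoted percentages, not ITU data.

```python
# Sketch of the affordability metric: annualized plan price as a
# percentage of GNI per capita. Prices are the 2015 figures cited
# above; the GNI values are back-calculated placeholders chosen
# only to reproduce the quoted percentages, not ITU data.
def affordability(monthly_price, annual_gni_per_capita):
    return 12 * monthly_price / annual_gni_per_capita * 100

print(affordability(15.90, 19_000))  # developed: ~1 percent
print(affordability(39.90, 2_900))   # least developed: ~16.5 percent
```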


Population density and country size do play a role in the cost of providing service to customers. Large countries and countries with large rural areas will find high costs to serve customers in rural areas. In Canada, 90 percent of people can be connected using facilities that cover just 3.3 percent of the land mass.

In Australia, 90 percent of people can be connected using facilities that cover just 4.32 percent of the land mass. In the United States, 90 percent of people can be connected using facilities covering about 31 percent of the land mass.

The point is that costs to connect rural customers will be quite high in countries with huge areas of sparse population.

source: Deloitte

Could, Should Telcos Invest More in Fixed Networks for Consumer Services?

Among the “in the hall” discussions I had at the PTC’19 conference was one extended discussion about whether U.S. telcos simply did not want to invest in better fixed network access infrastructure, or whether they actually could not do so, for business model reasons.

“They could do so, but do not want to” was one pole. The assumption here is that heavier access network investment would generate sufficient financial return to justify the spending.

“They cannot justify the business case” was the other pole. The assumption here is that revenue upside--while it exists--is not great enough to offer a payback.

That there is a challenge, many would accept. “Wireless substitution and cable competition have taken a toll on most wireline carriers’ customer base, leading to challenging economics and limited funds for fiber deployment,” researchers at Deloitte have said.

Some argue that fiber-to-the-home networks have allowed at least some fixed network telcos to hold market share against their cable TV operator rivals. Still, on a national level, “wireline telecom carriers account for about 37 percent of consumer broadband customers compared to 63 percent for cable,” Deloitte notes.

And cable has been gaining more than 100 percent of all net new accounts for at least a decade. In 2012, telecom companies enjoyed 44 percent broadband market share.


Nor does it seem likely that telco investments in fiber to the home can do much other than try to keep up with cable operators, who keep pushing faster speeds, with a gigabit the standard headline offer and 10-Gbps speeds already on the roadmap. Even when fixed network telcos do deploy new FTTH, it is to support gigabit speeds that only match cable.

The other problem is simply that profits to plow back into the fixed network business are thin, and getting thinner. Having lost leadership in the internet access market to cable TV operators, telcos also have lost half or more of their voice accounts, and now are seeing erosion of their linear video business as well.

There is, in other words, no revenue growth sufficient to drive big FTTH investments. There are limits to how much market share can be gained back from cable, and the investments will not slow voice and linear video erosion.


AT&T became the largest linear video supplier in the U.S. market not by upgrading its fixed networks but by acquiring DirecTV.

To be sure, the additional and continuously-expanding bandwidth requirements are an issue. One way or the other, higher bandwidth must be supplied or the fixed network business eventually collapses entirely.

But it increasingly seems likely that the solution is not ubiquitous optical fiber but use of mobile and wireless access. Mobile is the way to supply consumer voice. Linear video is declining, and AT&T uses satellite, at least for the moment.

Cable TV operators have the lead in internet access, and given their own need to rely more on internet access as video revenues contract, they are not likely to allow telcos to win back extensive share in the consumer internet access market.

Fixed network telcos need to find some way to keep extending bandwidth, to be sure. The problem is that it simply is difficult to make the business case, anymore, for ubiquitous FTTH. The revenue upside simply is too small, compared to the investment.

Still, “more bandwidth” will be required. Either that, or fixed network telcos must find ways to exit the consumer fixed network business. In essence, that is what CenturyLink has begun to do. Its revenues now are generated mostly by business customers. About 70 percent of CenturyLink revenue comes from that segment of the business. All nationwide consumer operations generate only 25 percent of total revenue.

Increasingly, the value of the fixed network is driven by mobile backhaul and enterprise services. Smaller business and consumer revenues are thin, and getting thinner.

So, in the end, it might not matter which side of the debate one believes. The statement that telcos could invest more in fixed networks, but choose not to, is correct.

Whether that is because the business model is not there, or because the business model is better if they do not invest, is somewhat secondary.

Will 5G Change Fixed Network Business Model?

Do industry executives increasingly believe 5G will change the economics of the fixed network business? It appears many now believe that is possible. As 5G networks are built, today’s fixed network access platforms (copper, fiber to home or hybrid fiber coax) will face new competition from fixed wireless and mobile wireless alternatives.

That will happen as deep fiber architectures enable use of lower-power microcells and small cells with access to an order of magnitude more spectrum, in addition to better radios and application of the frequency reuse principle (cell site division) at historically dense levels.



By some estimates, 4G has led to cell tower spacing of perhaps 2 kilometers (1.25 miles). Some believe 5G networks using millimeter wave spectrum will require small cells placed about every one-third of a mile. In other words, 5G using millimeter wave spectrum might require about eight times the number of transmitting sites, compared to 4G using towers spaced at about 2 km (1.25 miles).

That is probably a high-end forecast, assuming an equally dense network in all deployment scenarios, in dense urban and suburban settings. Virtually nobody believes that is possible in rural areas.

Some might note that such densities, while perhaps common in more-dense urban areas, are not so common in suburban settings, and nonexistent in rural areas, for the most part. Also, the simple assumption here is that the point-to-point optical backhaul used for a single macrocell also applies to denser networks.

In other words, if one site requires a single point-to-point connection, then 25 small cell sites require 25 point-to-point connections. But density itself changes the topology, leading to more of a tree-and-branch-off-a-ring design that does require lots more fiber, but not as much as point-to-point links would.
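
A back-of-envelope sketch makes the scaling explicit. Under a uniform square-grid assumption, site count (and the count of point-to-point backhaul links, one per site) scales with one over the spacing squared. Published multiples, like the eight-times figure above, fold in propagation, terrain and topology assumptions, so they differ from this naive geometry.

```python
# Back-of-envelope site-count scaling under a uniform square-grid
# assumption: sites (and point-to-point backhaul links, one per
# site) scale with 1 / spacing**2. Published estimates, like the
# "eight times" figure above, fold in propagation, terrain and
# topology assumptions, so they differ from this naive geometry.
def sites_needed(area_km2, spacing_km):
    return area_km2 / spacing_km ** 2

macro_4g = sites_needed(100, 2.0)            # ~2 km tower spacing
small_5g = sites_needed(100, (1 / 3) * 1.61) # ~1/3 mile small cells
print(small_5g / macro_4g)                   # naive density multiple
```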

Still, the business model impact on fixed line network operators will possibly be significant. Think of 5G as a new overbuild operation. Where today a cable operator and telco compete for the consumer internet access customer, tomorrow it could well be the case that those two competitors face a third or maybe even fourth competitor for a market that is very close to saturated.

That means market share losses will happen. The degree of business model disruption then turns on the amount of share the new competitors can take, and from which current suppliers.

3G to 4G to 5G: What is Common?

In one sense, it might be easier to envision 5G primarily as a way to supply increased bandwidth for consumer mobility, as one might see 4G as a way to increase bandwidth supply for mobile internet access demand.

Ignore for the moment the lower costs per unit 4G offered over 3G, or the similar benefits 5G will offer over 4G. Ignore for the moment the lower latency 4G offered over 3G, or the lower latency 5G will supply, compared to 4G.

Look only at 5G as a platform for supplying increased bandwidth at lower costs per bit, as was true for 4G as well. So long as there is a transparent fallback from 5G to 4G (as was true of 4G fallback to 3G), and so long as 5G experience and 4G experience are matched closely enough, then 5G can be added more gracefully than some expect.

In fact, one good reason for marketing and supplying bandwidth more incrementally when adding 5G is precisely so the 4G fallback is graceful. In my own experience, the fallback from 200 Mbps or 100 Mbps to 20 Mbps or 14 Mbps is quite graceful, per concurrent user.

In fact, unless there are some latency effects, I typically cannot tell you whether my current connection is running at 14 Mbps, 20 Mbps, 100 Mbps, 150 Mbps or some higher figure. Granted, I do not download big files very often.

My typical use cases range from web surfing to cloud-based communications to streaming video. None of those apps is overwhelmed or unpleasant if my connection is anywhere from 14 Mbps to some other three-digit rate. In other words, if I switch from a cable modem connection to mobile 4G, my experience does not suffer.

The point is that, even if I buy a 5G service with higher headline speeds, defaulting back to 4G is not going to be a problem.

Saturday, February 2, 2019

AIOps Now Shows how AI is Applied to Network Operations

I just got back from chairing the new AIOps Expo, sponsored by TMCnet, a three-day event looking at the advantages and challenges of artificial intelligence as applied to management of information technology systems (both enterprise and communications service provider).

These days, it is clear that value and opportunity for connectivity providers largely have shifted up the stack to applications and platforms.

One big question asked by enterprise buyers at the event is how “real” AIOps is, where and how it can be used today, and what the roadmap looks like.


As with all new buzzwords and trends, we have to define what we are talking about. According to Gartner researchers, “AIOps platforms combine big data and machine learning functionality to support all primary IT operations functions through the scalable ingestion and analysis of the ever-increasing volume, variety and velocity of data generated by IT.”

Putting the AIOps moniker into context, one might argue it is the latest way to describe the use of artificial intelligence (machine learning and deep learning more than neural networking at this point) to improve IT operations.


You can get an argument about where “automation” ends and “AI” begins; that was clear from discussions at the event. You will not get much argument that, at the present state of the art, it is much easier to apply AI to specific functions and processes than to integrate and correlate all functions and processes.

Compared to a decade ago, when the ability to analyze “big data” was the buzzword, insight still is the desired outcome. What seems different now is that Moore’s Law makes a difference.

Analysis that might have been just as valuable 10 years ago--or two decades ago--now is feasible because the costs of computation and storage are so much lower.

But one new emphasis is on machine learning: allowing AI to work autonomously, modifying system behavior based on what has been learned--with human approval of the policy, but without direct human intervention in each change.

One concrete difference is the role of scripting and code writing. Compared to present practice, the goal is to allow machines to modify their own behavior without direct coding labor. That obviously raises questions about bias in those systems, as well as security and privacy.
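
As a minimal sketch of what “modifying behavior without direct coding labor” can mean, consider an alert threshold that re-derives itself from recent observations rather than being hard-coded by an engineer. The window size and three-sigma policy are illustrative assumptions, not any vendor’s method.

```python
# Toy example of behavior modified by learning rather than coding:
# an alert threshold that re-derives itself from recent samples.
# The window size and 3-sigma policy are illustrative assumptions.
import statistics

class AdaptiveThreshold:
    def __init__(self, window=100, sigmas=3.0):
        self.samples = []
        self.window = window
        self.sigmas = sigmas

    def observe(self, value):
        """Feed in a normal reading; keep only a recent window."""
        self.samples.append(value)
        self.samples = self.samples[-self.window:]

    def is_anomaly(self, value):
        """Flag values far outside recently observed behavior."""
        if len(self.samples) < 10:  # not enough history yet
            return False
        mean = statistics.mean(self.samples)
        stdev = statistics.stdev(self.samples)
        return abs(value - mean) > self.sigmas * stdev
```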

But a big strategic change is the shift to allowing machines to discover patterns that would be prohibitively expensive to uncover if we attempted the work using human agents alone.

One illustration of the potential benefit can be glimpsed by asking why network operations centers have so many screens, as technologist Frank Yue, Kemp Technologies solution architect, did at the event. The reason is that each subsystem has its own management and monitoring system.

That means fault isolation is more complex than it would be if all systems were correlated--if all the data could be analyzed and understood in ways that reduce the total number of alarms, for example. The cascading alerts NOCs must deal with are themselves a problem, as many alerts from different systems actually refer to common events, noted Bhanu Singh, OpsRamp VP.
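
A minimal sketch of that kind of correlation collapses alerts from different monitoring silos that fire close together in time, on the theory that they describe one underlying event. The field names and the 30-second window are illustrative assumptions, not any product’s schema.

```python
# Sketch: collapse alerts from separate monitoring silos that fire
# close together in time, on the theory they describe one event.
# Field names and the 30-second window are illustrative assumptions.
def correlate(alerts, window_s=30):
    """Group time-sorted alerts into incidents, starting a new
    incident whenever the gap since the last alert exceeds window_s."""
    alerts = sorted(alerts, key=lambda a: a["ts"])
    incidents, current = [], []
    for alert in alerts:
        if current and alert["ts"] - current[-1]["ts"] > window_s:
            incidents.append(current)
            current = []
        current.append(alert)
    if current:
        incidents.append(current)
    return incidents

raw = [
    {"ts": 100, "system": "router", "msg": "link down"},
    {"ts": 104, "system": "load_balancer", "msg": "pool degraded"},
    {"ts": 109, "system": "app_monitor", "msg": "latency spike"},
    {"ts": 500, "system": "router", "msg": "link down"},
]
print(len(correlate(raw)))  # 2 incidents instead of 4 raw alerts
```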


In fact, as noted by Taly Dunevich, Ayehu global VP, AIOps is, in many ways, the latest way to automate IT processes, without scripting or coding.

All of which led Wayne Parker, Northrop Grumman technical advisor, to quip that “in 10 years we’ll be calling it something else.” Indeed.


The Business Case for FTTH in the 5G Era

Could U.S. fixed network service providers--other than cable operators--make higher profits if they chose to dramatically increase investment in their access networks? Some might argue the answer is “yes,” while others will maintain the answer is largely “no.”

The essential argument is that telcos could deploy more fiber and thereby grow revenues, in addition to providing better customer experience. Naysayers might point to several reasons why that is likely untrue.

First, U.S. cable operators already lead in market share for internet access, have been taking more than 100 percent of net new additions for years--no matter what telcos have done--and already know how they will migrate their platform from gigabit to 10-gigabit capabilities, at costs far lower than telcos can manage.

In fact, that has been true for the entire internet access era: cable operators simply have been able to upgrade internet access speeds faster, and at much lower cost, than telcos have. Nor is there any particular reason to think that will change in the fixed network arena.

That actually has been a key and ever-present issue in the competitive era.

Also, total fixed network revenues (telco and cable) are flattish to falling, as demand shifts to products that do not actually require use of the fixed network at all. Voice and linear video revenues are falling in both cable and telco segments of the fixed network business.

And there is every reason to believe that mobile substitution now is going to make itself felt in internet access and video as it has in voice.

Also, alternatives seem to keep popping up, ranging from overbuilders such as Google Fiber and other independent ISPs to coming use of low earth orbit satellite constellations and possibly other unorthodox platforms.

All of that means the potential return from a major telco investment in optical fiber access is getting harder to earn, not easier. It will not help that mobile access now appears poised to erode yet more share from the fixed networks.

Verizon total wireline revenues, for example, decreased 3.2 percent year over year in fourth-quarter 2018 and 3.0 percent for the full year, compared with 2017, to $7.4 billion and $29.8 billion, respectively.

AT&T fixed network revenues in the fourth quarter of 2018 likewise were down. That stagnating trend has been true for both firms, in the fixed network segment, for some years.

So to test the argument that telcos could make more money if they invested more, one must look at Verizon, which already has moved heavily to fiber-to-home facilities. It has not helped much, apparently, though losses would have been greater had it not done so.

AT&T has a far smaller optical access footprint, and might arguably benefit, in some of its markets, by upgrading to optical fiber. The issue is how much that would help. AT&T cannot expect much revenue lift from video, as it already is the largest U.S. linear video distributor. It is losing fixed network voice lines, as are all providers.

So AT&T would have to stake the value of an optical fiber upgrade, in the consumer segment, on internet access alone. Most who have looked at that model would agree it is likely not going to produce a positive financial result. The incremental gain in internet access share alone does not justify fiber-to-home investment on a ubiquitous scale.

Some of us believe the financial return from new optical deployments comes not from fiber-to-home, but from deep fiber to support 5G and business customers. That is the new twist as we move into the 5G era. At least in the U.S. market, the financial upside from FTTH increasingly gets worse as consumer use of the fixed network declines, and as cable operators dominate what remains of revenue and market share in that market.

A new era is coming.

Why AT&T Changed its Thinking about 5G Fixed Wireless

Over the last couple of years, AT&T has seemingly waned and waxed about commercial upside for 5G fixed wireless. For most of 2018, AT&T officials had expressed skepticism about the business model.

That thinking might have been based in large part on use of millimeter wave frequencies (28 GHz, 39 GHz) AT&T owns.

But the ability to use spectrum in the 3.6-GHz region (Citizens Broadband Radio Service) should require less-dense cells and backhaul, perhaps four to 16 times less dense.

Also, some of us would argue the business case for 5G increasingly looks like an enterprise play. Sure, consumers eventually will upgrade to 5G. But that is mostly substitution of one source for another, and does not necessarily increase total revenue much.

On the other hand, there long has been thinking that the latest generations of optical fiber for communications (NG-PON 2, for example) mostly are useful for business applications and mobile backhaul, not consumer mobility.

Combine that with the need for optical fiber densification and the logic is that 5G produces incremental new revenues in the business customer segment and provides internal value for 5G backhaul. Consumer revenues are mostly an afterthought.

All that might have brought a change of views about use of fixed wireless.

“I will say over time three to five year time horizon unequivocally 5G will serve as a broadband, a fixed broadband replacement product,” says AT&T CEO Randall Stephenson. “I am very convinced that that will be the case.”

“You know, we back in the 90s everybody was saying wireless would never serve as a substitute for fixed line voice because there wasn't sufficient capacity,” Stephenson said. “Well it is a substitute for voice.”

Right now, AT&T says it has 11 million fiber to home locations and eight million business locations. AT&T also expects to reach 14 million consumer fiber-to-home locations soon. It probably is worth noting that AT&T’s fixed network passes--is able to sell services to--as many as 62 million U.S. homes.

In other words, AT&T might soon pass 22.5 percent of its consumer locations with optical fiber drops.

Even without quantifying the matter, if AT&T has managed to build optical fiber to less than a quarter of its U.S. homes, and also believes 5G will provide a workable substitute within three to five years, it is hard to see the logic of continuing to build consumer optical fiber connections, at a time when consumer fixed line accounts are shrinking overall.

Phoenix Center Analysis is an Issue in Court Review of Net Neutrality

Not often does a single academic paper become a material issue when a major communications policy is adopted or reviewed by courts. But that seems to be the case for an analysis of capital investment levels written by George Ford, chief economist at the Phoenix Center.

The U.S. Court of Appeals for the District of Columbia Circuit is about to hear a challenge to the Federal Communications Commission’s 2018 Restoring Internet Freedom Order. The court review may turn on the FCC’s use of Ford’s analysis of the impact of Title II common carrier regulation on U.S. internet service provider capital investment.

Ford's analysis and the latest defense of his findings on capital investment in the wake of the common carrier classification seem once again to be a part of the policy review.

How Much New Connectivity Spending Does Consumer IoT Generate?

A common belief is that growing use of connected sensors and devices will drive additional connectivity revenues, and that is true to some extent. But it is likely that most of the connections will not create additional connectivity account demand.

Consider consumer and household use of connected appliances and devices. Today, most smartphone users connect to Wi-Fi when indoors. That does not create demand for new mobile connections in a direct sense.

Likewise, our internet TV devices use Wi-Fi. Our smartwatches use Bluetooth. Most of the other devices, ranging from security cameras to appliance sensors, rely on Wi-Fi.

Only the more “industrial” sensing functions in our homes tend to create demand for additional communications circuits. Gas and power meters provide good examples.

In fact, though the details are likely to change as IoT becomes commercialized on a wide scale, and though some new mobile connections will be created, a fair number of other local connection technology platforms might emerge as the significant pathways.

In many cases, those local connections will run to a local edge server, with external wide area communications happening less frequently. So there is some additional impact on WAN data traffic, but not necessarily a direct increase in the number of connections, even with additional IoT sensor or device use.
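
A minimal sketch of that local-first pattern: sensors talk to a nearby edge node constantly over cheap local links, while the edge node batches readings and touches the WAN only occasionally. The upload interval and payload shape are invented for illustration.

```python
# Sketch of the local-first pattern: sensors hit a nearby edge node
# constantly over cheap local links (Wi-Fi, Bluetooth), while the
# edge node batches readings and touches the WAN only occasionally.
# The interval and payload shape are invented for illustration.
import json
import time

class EdgeNode:
    def __init__(self, upload_interval_s=3600):
        self.buffer = []
        self.last_upload = time.time()
        self.upload_interval_s = upload_interval_s

    def on_sensor_reading(self, sensor_id, value):
        """Local traffic: frequent, cheap, no WAN connection used."""
        self.buffer.append({"sensor": sensor_id, "value": value})
        if time.time() - self.last_upload >= self.upload_interval_s:
            self.upload()

    def upload(self):
        """WAN traffic: an occasional batched summary, not a new
        connection per reading."""
        payload = json.dumps(self.buffer)  # would be sent upstream
        self.buffer = []
        self.last_upload = time.time()
```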


By some estimates, about 22 percent of IoT spending will be for connectivity. But other estimates suggest connectivity will account for far less than that, as a percentage of total spending on IoT.
source: Bain

Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...