Intel has long been consistent with its processor releases, but that cadence is slipping, according to eWeek’s story on the challenges the company now is having. It reports that CEO Brian Krzanich said the company is adding a third 14-nanometer processor, “Kaby Lake,” and pushing back its road map for the move to 10 nm chips:
The decision highlights the increasing manufacturing challenges Intel and other chip makers are running into as they shrink the circuitry of the chip. Intel had similar issues as it moved from the 22nm manufacturing process, delaying the launch of the 14nm "Broadwell" architecture by several months.
Cannonlake, the 10 nm chip, was expected next year. Now it likely will be released during the second half of 2017, eWeek said.
Moore’s Law, which predicts that the number of transistors in integrated circuits will double at a steady pace, has held up for a half century. However, the traditional processor approaches are bumping up against the limits of physics. The smaller the processor, the closer the elements must be placed and the more intractable the challenges become. At some point, Moore’s Law, at least as it relates to traditional approaches, will no longer apply. This is why quantum computing and even more esoteric concepts are generating interest these days.
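The scale of that half-century run is easy to underestimate. A quick back-of-the-envelope sketch, assuming a fixed two-year doubling period and an illustrative starting transistor count (neither figure is from the article):

```python
# Moore's Law as a rough rule of thumb: transistor counts double
# roughly every two years. The starting count and doubling period
# below are illustrative assumptions, not Intel figures.

def transistor_count(years_elapsed, start_count=2_300, doubling_period=2.0):
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (years_elapsed / doubling_period)

# Fifty years is 25 doublings -- roughly a 33-million-fold increase.
growth = transistor_count(50) / transistor_count(0)
print(round(growth))  # 33554432, i.e. 2**25
```

The exponential form is exactly why each further shrink gets harder: every doubling demands that the same physical gains be repeated on an ever-smaller scale.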
Broadband for the Poor
The Department of Housing and Urban Development is planning a pilot project that will provide broadband and digital training to more than 200,000 people in public or assisted housing in 28 communities.
The Hill says that ConnectHome will be supported by local governments, telecom firms and nonprofits. The program will focus on children in low-income households. CenturyLink, Cox Communications, Sprint and Google Fiber are among the companies involved in the project.
Employers Not the Only Ones Worried About BYOD Security
Much of the drama surrounding Bring Your Own Device (BYOD) focuses on whether employees can keep their companies’ data secure and, if so, how. There is another angle to the data security issue, however. The question is a pointed one: Can companies be trusted with the data of their employees?
The jury still is out. A MobileIron survey of 3,500 workers in the U.S., UK, France, Germany, Spain and Japan found that only 61 percent felt that their data is indeed safe. About half, the report in Network World said, are uncomfortable with the idea of employers having access to their personal emails, contacts and texts.
The IIoT to Help Keep Things Moving
The Industrial Internet of Things (IIoT) can be a potent tool, according to Computerworld. National Instruments and IBM are exploring its use to keep vital infrastructure operational. The two have built a test bed capable of tracking the extensive array of sensors and analytic assets deep within industrial infrastructure. This allows real-time monitoring, which can both prevent downtime and save money:
Utilities are interested in cloud computing for the same reasons as other companies, he said: Buying, deploying and maintaining big data centers inside an enterprise is expensive and difficult, and public clouds can be a simpler alternative. But there are a host of other advances coming with IoT that could make industrial monitoring and maintenance much better. For one thing, today many kinds of equipment get tested only as often as a specialized engineer can come around to check them out, Smith said. Built-in IoT sensors collect data all the time.
Utilities, rail lines, heavy equipment and mining are among the potential beneficiaries of this approach.
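The core shift described above is from periodic manual inspection to continuous, automated checking of sensor streams. A minimal sketch of that idea, with sensor names, thresholds and readings invented for illustration (this is not the NI/IBM test bed itself):

```python
# Instead of waiting for an engineer's periodic visit, built-in sensors
# stream readings continuously and a monitor flags out-of-range values
# as they arrive. All names and limits here are hypothetical.

def check_readings(readings, limits):
    """Return (sensor, value) pairs that fall outside their allowed range."""
    alerts = []
    for sensor, value in readings:
        low, high = limits[sensor]
        if not (low <= value <= high):
            alerts.append((sensor, value))
    return alerts

# Hypothetical limits for two common rotating-equipment metrics.
limits = {"bearing_temp_c": (0, 80), "vibration_mm_s": (0, 7.1)}
stream = [("bearing_temp_c", 72.5), ("vibration_mm_s", 9.4)]
print(check_readings(stream, limits))  # [('vibration_mm_s', 9.4)]
```

Run at streaming rates rather than inspection intervals, even a simple threshold check like this can surface a failing component days before a scheduled visit would.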
An Interim Step Between 100G and 400G
The current state of the art for internal carrier transport speed is 100 gigabits per second (Gbps). This is fast, but not fast enough. LightReading’s Steve Koppman writes that the increase in demand is squeezing the transport speed standard. This raises a problem: The industry has set 400G as the next step in high-capacity networking. However, Koppman writes that it still is years away due to economic constraints on the silicon photonics that will be at its heart.
The answer, at least in the short term, is an interim generation that offers 200G of capacity. The idea, essentially, is to double today’s technology:
Where very long distances aren't required, denser, "16-QAM" modulation can effectively map two 100G services onto a single wavelength, rapidly doubling bandwidth capacity without major new capital investment. 16 QAM's big drawback is limited reach. While this limitation will be solved, timing is uncertain. There are claims, difficult to verify, that improvements may be in the works in the near future, though their potential "side effects" in terms of other disadvantages are also uncertain.
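The doubling falls out of simple modulation arithmetic: an M-point constellation carries log2(M) bits per symbol, so moving from QPSK (commonly the basis for 100G coherent transport) to 16-QAM doubles the bits per symbol at the same baud rate. A sketch of that calculation, with the 100G QPSK baseline taken as an assumption for illustration:

```python
# Each symbol of an M-point constellation carries log2(M) bits, so the
# line rate scales with the ratio of bits-per-symbol between formats.
from math import log2

def line_rate_gbps(baseline_gbps, baseline_points, new_points):
    """Scale a line rate by the bits-per-symbol ratio of two formats."""
    return baseline_gbps * log2(new_points) / log2(baseline_points)

# QPSK (M=4, 2 bits/symbol) -> 16-QAM (M=16, 4 bits/symbol)
print(line_rate_gbps(100, 4, 16))  # 200.0
```

The denser constellation packs points closer together, which is exactly why reach suffers: less noise is needed to push a received symbol into a neighbor’s decision region.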
Koppman writes that more than 30 mostly medium-sized and small carriers worldwide are taking this approach.
Carl Weinschenk covers telecom for IT Business Edge. He writes about wireless technology, disaster recovery/business continuity, cellular services, the Internet of Things, machine-to-machine communications and other emerging technologies and platforms. He also covers net neutrality and related regulatory issues. Weinschenk has written about the phone companies, cable operators and related companies for decades and is senior editor of Broadband Technology Report. He can be reached at firstname.lastname@example.org and via Twitter at @DailyMusicBrk.