Astadia Releases Reference Architectures for Migrating Unisys and IBM Mainframe Workloads to Microsoft Azure

The content below is taken from the original (Astadia Releases Reference Architectures for Migrating Unisys and IBM Mainframe Workloads to Microsoft Azure), to continue reading please visit the site. Remember to respect the Author & Copyright.

Astadia, the premier technology consultancy focused on modernizing mainframe workloads, today announced the release of two separate reference architectures for moving Unisys and IBM mainframe workloads to Microsoft Azure’s public cloud computing infrastructure. Astadia has moved mainframe workloads to distributed environments for more than 25 years and has applied that expertise to architecting the solution for customers adopting Azure.

Mainframes
still run significant workloads on behalf of commercial and public
sector organizations; yet the cost of maintaining these platforms
increases annually while the availability of skilled workers rapidly
declines. Azure is now ready to support these workloads with security,
performance and reliability, fueling new digital transformation and
innovation.

“For
decades, mainframes have traditionally housed the most mission critical
applications for an organization,” said Scott Silk, Astadia Chairman
and CEO. “Microsoft Azure is ready to take on these workloads and
Astadia is ready to help organizations make the move with a low-cost,
low-risk approach, and then provide ongoing services to manage the
resulting environment.”

“Astadia
has been a trusted Microsoft platform modernization partner for years,”
said Bob Ellsworth, Microsoft’s Director of Enterprise Modernization
and Azure HiPo Partners. “Astadia is a proven mainframe applications and
database consultancy and their focus on Azure will benefit numerous
enterprise companies.”

Astadia’s Mainframe to Azure Reference Architectures Built on Decades of Experience

Astadia
has completed over 200 successful platform modernization projects and
has a proven methodology, best practices and proprietary tools for
discovery, planning, implementation and on-going management and support.
The Mainframe to Cloud Reference Architectures cover the following
topics:

  • Drivers and challenges associated with modernizing mainframe workloads
  • A primer on the specific mainframe architectures
  • A primer on the Microsoft Azure architecture
  • Detailed Reference Architecture diagrams and accompanying narrative

Availability

The Unisys to Microsoft Azure Reference Architecture is available for free download at http://bit.ly/2u5zz2K

The IBM Mainframe to Microsoft Azure Reference Architecture is available for free download at http://bit.ly/2v487zt

skyfonts (5.9.2.0)


SkyFonts allows you to install fonts from participating sites such as Google Fonts and keep them up to date.

6502 Retrocomputing Goes to the Cloud


In what may be the strangest retrocomputing project we’ve seen lately, you can now access a virtual 6502 via Amazon’s Lambda computing service. We don’t mean there’s a web page with a simulated CPU on it. That’s old hat. This is a web service that takes a block of memory, executes 6502 code that it finds in it, and then returns a block of memory after a BRK opcode or a time out.

You format your request as a JSON-formatted POST request, so anything that can do an HTTP post can probably access it. If you aren’t feeling like writing your own client, the main page has a form you can fill out with some sample values. Just be aware that the memory going in and out is base 64 encoded, so you aren’t going to see instantly gratifying results.
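As a sketch of what such a request body might look like, here the example program is packed into a base64-encoded JSON payload (the field name is an assumption; check the service's form page for the real one):

```python
import base64
import json

# The 15-byte example program from the article: three LDA/STA pairs
# that store $01, $05, $08 at $0200-$0202.
program = bytes.fromhex("a9018d0002a9058d0102a9088d0202")

# The service wants the memory image base64-encoded inside a JSON body.
# "memory" is an assumed field name, not the service's documented one.
payload = json.dumps({"memory": base64.b64encode(program).decode("ascii")})
print(payload)

# Any HTTP client can then POST `payload`; the returned memory comes back
# base64-encoded too, so reverse the process with base64.b64decode().
```
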

You may not be familiar with Amazon Lambda. It is the logical extension of the Amazon cloud services. Time was that you paid to have a server in a data center. The original Amazon cloud services let you spin up a virtual server that could come into existence when needed. You could also duplicate them, shut them down, and so on. However, Lambda is even one step further. You don’t have a server. You just have a service. When someone makes a request, the Amazon servers handle it. They also handle plenty of other services for other people.

There’s some amount of free service, but beyond that Amazon charges for every 100 ms of execution you use. We don’t know how long the average 6502 program runs.

Is it practical? We can’t think of why, but we’ve never let that stop us from liking something before. Just to test it, we put the example code into an online base64 decoder and wound up with this:

a9 01 8d 00 02 a9 05 8d 01 02 a9 08 8d 02 02

Then we went over to an online 6502 disassembler and got:

* = C000
C000 A9 01 LDA #$01
C002 8D 00 02 STA $0200
C005 A9 05 LDA #$05
C007 8D 01 02 STA $0201
C00A A9 08 LDA #$08
C00C 8D 02 02 STA $0202
C00F .END

We then ran the 6502cloud CPU and decoded the resulting memory output to (with a bunch of trailing zeros omitted):

01 05 08 00 00 00 00 00

So for the example, at least, it seems to work.

We’ve covered giant 6502s and small 6502 systems. We have even seen that 6502 code lives inside Linux. But this is the first time we can remember seeing a disembodied CPU accessible by remote access in the cloud.


Tesla is building world’s largest backup battery in Australia


After blackouts left 1.7 million residents without electricity, Elon Musk famously guaranteed that Tesla could supply 100 megawatts of battery storage in 100 days. The company has announced it will do just that, supplying a Powerpack battery storage system that can power over 30,000 homes. The 100-megawatt project "will be the highest power battery system in the world by a factor of three," tweeted CEO Elon Musk. It will back up the 315 megawatt Hornsdale Wind Farm, charging during low energy usage and providing electricity for peak hours.

Though the company seemed destined to get the job, the South Australian government picked it after a "competitive bidding process," Tesla said. It added that the size of the system will be enough to cover the 30,000 or so homes in the region that were affected by blackouts.


Tesla’s Powerpack battery storage system (AOL/Roberto Baldwin)

Those power outages set off a political conflagration that culminated in a very testy press conference with South Australia’s Premier and the Federal Environment Minister. Shortly afterwards, Prime Minister Malcolm Turnbull unveiled a $1.5 billion plan to expand the power grid to run an additional 500,000 homes, including backup battery storage.

That was when Tesla Energy head Lyndon Rive stepped in and made his "100 megawatts in 100 days" pledge, with (his cousin) Musk upping the ante by promising the system would be free if they didn’t achieve the goal.

Musk confirmed that he’d keep the promise, telling Australia’s ABC News that "if South Australia is willing to take a big risk, then so are we." The 100 day pledge reportedly begins once the grid interconnection agreement is inked, and Musk estimates that it will cost him "probably $50 million or more" if the installation isn’t completed in time.

Via: Elon Musk (Twitter)

Source: Tesla

aws-password-extractor (1.0.2)


Behind the scenes of Slovakians’ fight to liberate their .sk domain


Analysis The Slovakian internet community is pressuring its government to block the sale of the country’s .sk internet registry, asking for it to “be returned to the people of Slovakia.”

Having run the top-level domain (TLD) for over a decade, registry operator SK-NIC announced earlier this year that it was planning to sell .sk and was in talks with London-based Centralnic, which also operates a number of other general and country-specific top-level domains.

The registry has a healthy 360,000 registered domains and charges €10 wholesale to registrars (who then sell the domains on to internet users with a markup). But its systems are outdated and the domain name market has moved rapidly in recent years with the creation of more than 1,000 new generic TLDs.

That makes .sk a poor investment for SK-NIC – which is owned by one of Slovakia’s largest telcos, DanubiaTel – and a good acquisition for Centralnic, which already has the modern systems and infrastructure to sell domain names.

However when the news broke that a foreign company was looking to buy the .sk registry, protests were launched claiming, among other things, that the registry had been “stolen” and was now being sold off for profit to a foreign company.

One group has set up a petition that has attracted just under 10,000 signatures and aims to “create public pressure on decision-makers and return .SK back to the community and people.”

A number of talks and blog posts have also emerged arguing that a new non-profit organization needs to be set up to run .sk for the people, rather than a publicly listed company.

Those efforts criticize any move away from the current (outdated) registration system, named FRED, oppose any effort to open up registration of .sk domains to people living outside Slovakia, and are increasingly critical of the small number of government officials who seemingly approve of and are pushing the sale.

Huh

And if those last few complaints seem unusually industry specific and political, that’s because of who is behind the campaign: a new breed of internet entrepreneurs who have recently set up their own political party, Progressive Slovakia, and are hoping to steal seats from the mainstream parties.

The people in the party behind the push also happen to be in charge of or employed by the exact same registrars that would profit most from a new non-profit organization running .sk over which they had significant influence.

As for squaring complaints about the .sk registration systems being out of date with insisting that the old FRED registration system be retained: that is almost certainly because it would cost those companies time and money to shift to a new registration system.

And the concerns about making .sk domains available outside Slovakia? It has become common practice for country-code top-level domains to be opened up to anyone worldwide interested in a specific ending. In most cases, it has led to a positive situation where companies use country-specific domains for that market and everyone benefits from a larger registry.

However, if Slovakia’s market is opened up and the registration system is moved to an industry standard, it means that large global domain registrars – like GoDaddy, for example – will bring serious competition overnight.

In addition, the main claim that the Slovakian top-level domain was “stolen” from a university and given to a telco company by government officials doesn’t hold much water.

It was extremely common in the early days of the internet for country-code top-level domains to be run by universities and then, as demand grew and national governments became interested, most countries moved to a model exactly like SK-NIC’s: a new non-profit run by a company with technical expertise and with a board structure that included government, business and internet community voices.

The exact same approach was introduced in the UK and .uk domain names with Nominet.

Let’sEncrypt – Wildcard Certificates Coming January 2018


This will make it easier to secure web servers for internal, non-internet-facing tools. It will be especially helpful for anyone whose DNS service does not support DNS-01 hooks for alternative LE verifications: generate a wildcard CSR on an internet-facing server, then transfer the valid wildcard cert to the internal server.
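As a rough sketch of that workflow (hostnames and paths are illustrative, and wildcard issuance will require the ACME v2 endpoint and a DNS-01 challenge once support launches in January 2018):

```shell
# On the internet-facing host: request a wildcard certificate using a
# manual DNS challenge (certbot prints the TXT record you must create).
certbot certonly --manual --preferred-challenges dns -d '*.internal.example.com'

# Then copy the issued certificate and key to the internal server.
scp /etc/letsencrypt/live/internal.example.com/fullchain.pem \
    /etc/letsencrypt/live/internal.example.com/privkey.pem \
    admin@internal-server:/etc/ssl/private/
```
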

 

http://bit.ly/2sLpY1r

Announcing new set of Azure Services in the UK


We’re pleased to announce the following services which are now available in the UK!

Azure Container Service –  Azure Container Service is the fastest way to realize the benefits of running containers in production. It uses customers’ preferred choice of open source technology, tools, and skills, combined with the confidence of solid support and a thriving community ecosystem. It provides simplified configurations of proven open source container orchestration technology, optimized to run in the Azure cloud. In just a few clicks, customers can deploy container-based applications in production, on a framework designed to help manage the complexity of containers deployed at scale. Unlike other container services, Azure Container Service is built on 100% open source software and offers a choice between the open source orchestrators Kubernetes, DC/OS, and Docker Swarm with Swarm mode.
The UK region is the first Azure region featuring Docker Swarm mode instead of legacy Swarm.

Learn more about Container Service.


Log Analytics – Azure Log Analytics is a service in the Operations Management Suite (OMS) offering that monitors your cloud and on-premises environments to maintain their availability and performance. It collects data generated by resources in your hybrid cloud environments and from other monitoring tools to provide insights and analysis and help you detect and respond to issues quickly.
With the availability of Log Analytics in the UK, you can now access a full set of operations management and security services (Log Analytics, Automation, Security Center, Backup and Site Recovery) in the UK.

Learn more about Log Analytics.

Logic Apps –  Logic Apps provide a way to simplify and implement scalable integrations and workflows in the cloud. A visual designer lets you model and automate your process as a series of steps known as a workflow. Logic Apps is a fully managed iPaaS (integration Platform as a Service), so developers don’t have to worry about building, hosting, scalability, availability and management. Logic Apps scales up automatically to meet demand.
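Under the hood, a Logic Apps workflow is defined declaratively in JSON. A minimal sketch (the action name and target URI are invented for illustration) might look like:

```json
{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "triggers": {
    "manual": { "type": "Request", "kind": "Http" }
  },
  "actions": {
    "Send_notification": {
      "type": "Http",
      "inputs": { "method": "POST", "uri": "https://example.com/notify" }
    }
  },
  "outputs": {}
}
```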

Learn more about Logic Apps.

Azure Stream Analytics –  Azure Stream Analytics is a fully managed, cost effective real-time event processing engine that helps to unlock deep insights from data. Stream Analytics makes it easy to set up real-time analytic computations on data streaming from devices, sensors, web sites, social media, applications, infrastructure systems, and more.

With a few clicks in the Azure portal, you can author a Stream Analytics job specifying the input source of the streaming data, the output sink for the results of your job, and a data transformation expressed in a SQL-like language. You can monitor and adjust the scale/speed of your job in the Azure portal to scale from a few kilobytes to a gigabyte or more of events processed per second.
Stream Analytics leverages years of Microsoft Research work in developing highly tuned streaming engines for time-sensitive processing, as well as language integrations for specifying such computations intuitively.
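For a flavor of the SQL-like language, a job might average a sensor reading over tumbling windows like this (the input/output aliases and field names here are invented for illustration):

```sql
-- Average temperature per device over 30-second tumbling windows
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp AS windowEnd
INTO
    [output-sink]
FROM
    [input-source] TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(second, 30)
```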

Learn more about Stream Analytics.

SQL Threat Detection –  SQL Threat Detection provides a new layer of security, which enables customers to detect and respond to potential threats as they occur by providing security alerts on anomalous activities. Users will receive an alert upon suspicious database activities, potential vulnerabilities, and SQL injection attacks, as well as anomalous database access patterns. SQL Threat Detection alerts provide details of suspicious activity and recommend action on how to investigate and mitigate the threat. Users can explore the suspicious events using SQL Database Auditing to determine if they are caused by an attempt to access, breach, or exploit data in the database. Threat Detection makes it simple to address potential threats to the database without the need to be a security expert or manage advanced security monitoring systems.

Learn more about SQL Threat Detection.

SQL Data Sync Public Preview –  SQL Data Sync (Preview) is a service of SQL Database that enables you to synchronize the data you select across multiple SQL Server and SQL Database instances. To synchronize your data, you create sync groups which define the databases, tables and columns to synchronize as well as the synchronization schedule. Each sync group must have at least one SQL Database instance which serves as the sync group hub in a hub-and-spoke topology.

Learn more about Azure SQL Data Sync.

Managed Disks SSE (Storage Service Encryption) –  Azure Storage Service Encryption (SSE) is now supported for Managed Disks. SSE provides encryption-at-rest and safeguards your data to meet your organizational security and compliance commitments.
Starting June 10th, 2017, all new managed disks, snapshots, images and new data written to existing managed disks are automatically encrypted-at-rest with keys managed by Microsoft.

Learn more about Storage Service Encryption for Azure Managed Disks.

We are excited about these additions, and invite customers using the UK Azure region to try them today!

Einride’s self-driving truck looks like a giant freezer on wheels


Einride has just revealed the prototype of the T-pod, its autonomous electric truck. The Swedish company’s self-driving vehicle can transport 15 standard pallets and can travel 124 miles on one charge. And because there’s no need for a person to sit inside of it, the T-pod also has no cab space and no windows — giving it a very futuristically odd look.

The truck uses a hybrid driverless system. While on highways, the T-pod drives itself, but on main roads, a human will remotely manage the driving system. People will also monitor T-pods as they drive on highways in case a situation arises that necessitates human control. Einride is currently working on charging stations for the trucks.

Einride isn’t the only company working on driverless shipping trucks. Waymo, Uber and Daimler are among the companies also developing similar vehicles. For shipping at larger scales, self-navigating and remote-controlled ships as well as massive drones are also in the works.

The T-pod prototype isn’t fully developed quite yet, but Einride expects to have its first completed truck available to customers in the fall. By 2020, the company plans to have a fleet of 200 goofy-looking trucks that will travel between Swedish cities Gothenburg and Helsingborg, carrying an expected two million pallets per year.

Source: VentureBeat

MusicBrainz User Survey


It’s hard to overstate how much MusicBrainz depends on the community behind it. In 2016 alone, 20,989 editors made a total of 5,935,653 edits at a continuously increasing rate.

But while MusicBrainz does collect data on a lot of different entities, its users are not one of them, and the privacy policy is pretty lean.
Unfortunately this does make it fairly difficult to find out who you are, how you use MB and why you do it.

Seeing as this kind of information is fairly important for the upcoming project of improving our user experience, I volunteered to create a survey to allow you to tell us how you use MB, what you like about it and what you don’t like quite as much.

So without further ado, click on the banner to get to the survey: (It shouldn’t take more than 15 minutes of your time.)
MusicBrainz User Survey

Now if you’re still reading this blog post, that hopefully means you’ve already completed the survey! I’d like to thank Quesito who joined this project earlier this year and has been a great deal of help, our former GCI student Caroline Gschwend who helped with the UX part of the survey, CatQuest who has been around to give great feedback since the first draft and of course also all the other people who helped bring this survey to the point of release.

If you’ve got any feedback on or questions about the survey itself, please reply to the Discourse forum topic.

Volvo will only make electric and hybrid cars starting in 2019


Volvo is done with entirely traditional engines and exclusively gas-powered vehicles, the company announced. By 2019, Volvo Group intends to offer only fully electric or hybrid powertrains on all new models, making it the first automaker to commit to using only alternative drivetrains.

The end of the solely combustion engine-powered car did seem like an eventual inevitability, given the advantages of electric and hybrid from a manufacturing and performance standpoint, and given the industry’s heavy investment in autonomous vehicle tech. But for Volvo to commit to going entirely hybrid and electric just two years from now is still the strongest sign yet we’ve seen that the purely combustion engine’s days might be numbered sooner, rather than later.

Volvo already had a target sales figure of 1 million electric and hybrid cars by 2025, and now that target seems well within reach given it’s all it’ll be selling in terms of new vehicles. Volvo also announced it would launch five new electric and hybrid cars between 2019 and 2021, and that two of those will be made by Polestar, which the company recently announced would become its own subsidiary and brand selling performance EVs, likely to compete with high-end Tesla models.

Part of the cost benefit of making electric cars is dealing with ever stronger emissions requirements on vehicles, which are set to get tighter in most key international markets, including China, which is where Volvo’s owner Geely is based. Production costs of EV parts and batteries are also getting smaller, as capacity and manufacturing processes improve.

Volvo getting a jump on the move to electric, with hybrids included as a transitional stopgap, is a smart and aggressive move for claiming a leadership position in the market of the future. A lot of companies talk about their work with, and commitment to, alternative drivetrains, but this is really putting your money where your mouth is.

Featured Image: Doug Geisler

Choosing the right compute option in GCP: a decision tree


By Terrence Ryan, Developer Advocate and Adam Glick, Product Marketing Manager

When you start a new project on Google Cloud Platform (GCP), one of the earliest decisions you make is which computing service to use: Google Compute Engine, Google Container Engine, App Engine or even Google Cloud Functions and Firebase.

GCP offers a range of compute services that go from giving users full control (i.e., Compute Engine) to highly-abstracted (i.e., Firebase and Cloud Functions), letting Google take care of more and more of the management and operations along the way.

Here’s how many long-time readers of our blog think about GCP compute options. If you’re
used to managing VMs and want a similar experience in the cloud, pick Compute Engine. If you use containers and Kubernetes, you can abstract away some of the necessary management overhead by using Container Engine. If you want to focus on your code and avoid the infrastructure pieces entirely, use App Engine. Finally, if you want to focus purely on code and build microservices that expose API endpoints for your applications, use Firebase and Cloud Functions.

Over the years, you’ve told us that this model works great if you have no constraints, but can be challenging if you do. We’ve heard your feedback and propose another way to choose your compute options using a constraint-based set of questions. (It should go without saying that we’re considering very small aspects of your project.)

1. Are you building a mobile or HTML application that does its heavy lifting, processing-wise, on the client? If you’re building a thick client that only relies on a backend for synchronization and/or storage, Firebase is a great option. Firebase allows you to store complex NoSQL documents (or objects if that’s how you think of them) and files using a very easy-to-use API and client available for iOS, Android and Javascript. There’s also a REST API for access from other platforms.

2. Are you building a system based more on events than user interaction? In other words, are you building an app that responds to uploaded files, or maybe logins to other applications? Are you already looking at “serverless” or “Functions as a Service” solutions? Look no further than Cloud Functions. Cloud Functions allows you to write Javascript functions that run on Node.js and that can call any one of our APIs including Cloud Vision, Translate, Cloud Storage or over 100 others. With Cloud Functions, you can build complex individual functions that get exposed as microservices to take advantage of all our services without having to maintain systems and glue them all together.

3. Does your solution already exist somewhere else? Does it include licensed software? Does it require anything other than HTTP/S? If you answered “no,” App Engine is worth a look. App Engine is a serverless solution that runs your code on our infrastructure and charges you only for what you use. We scale it up or down for you depending on demand. In addition, App Engine has access to all the Google SDKs available so you can take advantage of the full Google Cloud ecosystem.

4. Are you looking to build a container-based system built on Kubernetes? If you’re already using Kubernetes on GCP, you should really consider Container Engine. (You should think about it wherever you’re going to run Kubernetes, actually.) Container Engine reduces building a Kubernetes solution to a single click. Additionally, it auto-scales Kubernetes cluster members, allowing you to build Kubernetes solutions that grow and contract based on demand.

5. Are you building a stateful system? Are you looking to use GPUs in your solution? Are you building a non-Kubernetes container-based solution? Are you migrating an existing on-prem solution to the cloud? Are you using licensed software? Are using protocols other than HTTP/S? Have you not found another solution to meet your needs? If you answered “yes” to any of these questions, you’re probably going to need to run your solution on virtual machines on Compute Engine. Compute Engine is our most flexible computing product, and allows you the most freedom to configure and manage your VMs however you like.

Put all of these questions together and you get the following flowchart:
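In code form, the five questions can be paraphrased as a small decision function (a sketch for illustration, not an official Google tool):

```python
def pick_gcp_compute(thick_client: bool = False,
                     event_driven: bool = False,
                     http_only_new_code: bool = False,
                     kubernetes: bool = False) -> str:
    """Walk the constraint questions in order, most abstracted option first."""
    if thick_client:          # Q1: mobile/HTML app doing its work client-side
        return "Firebase"
    if event_driven:          # Q2: responds to events; "serverless" functions
        return "Cloud Functions"
    if http_only_new_code:    # Q3: new, unlicensed, HTTP/S-only code
        return "App Engine"
    if kubernetes:            # Q4: container system built on Kubernetes
        return "Container Engine"
    # Q5: everything else (stateful, GPUs, lift-and-shift, licensed software...)
    return "Compute Engine"

print(pick_gcp_compute(kubernetes=True))  # → Container Engine
```
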

This is by no means a comprehensive decision tree, and each one of our products supports a wider range of use cases than is presented here. But this should be a good guide to get you started.

To find out more about our computing solutions, please check out Computing on Google Cloud Platform and then try it out for yourself today with $300 in free credits when you sign up.

Happy building!

Alexa is learning more new skills every day


Just two months after Amazon announced it was "doubling down" on its Echo ecosystem, the company has confirmed that its Alexa voice platform has passed 15,000 skills. Impressive, especially in comparison to Google Assistant’s 378 voice apps and Cortana’s meager 65 — but what’s more impressive is the rate at which Alexa is gaining these skills.

Alexa reached 15,000 skills in June — during this month alone new skill introductions increased by 23 percent. The milestone also represents a 50-percent increase in skills since February, when Amazon officially announced it had hit 10,000 — and even that figure was triple what it was the previous September.

Alexa is gaining skills rapidly, which is no doubt part of Amazon’s plan to maintain its dominance in the voice-powered device landscape — it’s on track to control 70 percent of the market this year. Of course, its acceleration in June may well have been spurred on by Apple’s announcement that a similar product, the Siri-powered HomePod, will be on the market come December.

Does the HomePod represent serious competition to Amazon? Not yet. For a start, the HomePod’s unique selling point seems to be the quality of its speakers, rather than the assistant that comes with it, and no-one knows yet whether third-party developers will be able to create HomePod-compatible apps.

And yet Amazon is expanding its Alexa ecosystem at a dizzying rate, and throwing up some red flags as it does. Developers creating popular game skills are being given cash rewards, but there’s no overarching tool to allow creators to make money from their apps, nor is there a team dedicated to monitoring app service violations.

This focus on ecosystem expansion, instead of refinement, could well cause problems for Amazon down the line, but certainly at this stage the company needn’t worry about the competition. Anyway, no doubt there will soon be a skill that lets Alexa do the worrying instead.

Via: TechCrunch

Source: Voicebot

This little 64-bit NanoPi went to wireless


The $25 NanoPi Neo Plus2 SBC combines the WiFi, Bluetooth, and 8GB eMMC of the Neo Air with the quad-core Cortex-A53 Allwinner H5 of the Neo2, and boosts RAM to 1GB. Despite bulking up in one dimension to 52 x 40mm, FriendlyElec’s NanoPi Neo Plus2 is still part of the headless, IoT-oriented Neo family, joining […]

Troubleshoot OneNote problems, errors & issues in Windows 10


Microsoft OneNote is excellent software for gathering information and collaborating with multiple users. The software has been updated and gets better over time – but nothing is perfect at the end of the day, and there may be times when you will have to troubleshoot OneNote errors & problems. This post runs you through some of the issues which you may face at some point.

OneNote

OneNote problems, errors & issues

If Microsoft OneNote is not working, then this post will help you troubleshoot & fix OneNote problems, errors & issues in Windows 10, and suggest workarounds too.

Open Notebooks created in earlier versions of OneNote

The later versions of OneNote support documents in the 2010-2016 format. If a user tries to open a document created in OneNote 2003 or OneNote 2007, it will not open directly. However, the documents can be converted to the appropriate format so that they work well with the later versions of OneNote. This can be done as follows:

  1. Open the notebook in OneNote 2016 or 2013 (even though it might not display properly).
  2. Select the File tab and click on Info.
  3. Next to the name of your notebook, click on Settings and then on Properties.
  4. In the window that opens, choose Convert to 2010-2016.
  5. The converted file can be opened on Windows Mobile as well.

OneNote can’t open my page or section

If you see the “There’s a problem with the contents in this section” error message, open the notebook in the desktop version of OneNote, which provides notebook recovery options.

SharePoint related errors with OneNote

Most errors reported on OneNote are with sites shared on SharePoint. Please log on to the system as an administrator before proceeding with the steps.

Syncing issue with My SharePoint Notebook

OneNote supports versions of SharePoint newer than SharePoint 2010. Older versions aren’t supported; this is by design.

TIP: This post will show you how to enable or disable syncing of files from OneNote.

Turn off Check-in/Check-out in SharePoint Document Library

  1. Open the document library in SharePoint.
  2. In the ribbon for Library Tools, select Library, then Library settings and then Versioning settings.
  3. Change the value of Require Check Out to No.

Turn off Minor Versions in SharePoint Document Library

  1. Open the document library in SharePoint.
  2. In the ribbon for Library Tools, select Library, then Library settings and then Versioning settings.
  3. Change the value of Document Versioning History to No Versioning.

Turn off Required Properties in SharePoint Document Library

  1. Open the document library in SharePoint.
  2. In the ribbon for Library Tools, select Library, then Library settings.
  3. Find the table titled Columns on the window and check if any of the items under the Required column have a check mark.
  4. Should you find any item marked as required, then set its value to No.

OneNote Quota errors

Storage issues might also be a problem for those working with OneNote. Some of the issues with exceeded quota limits could be mitigated as below, says Microsoft.

To start with, figure out if the notebook is stored on OneDrive or SharePoint. The difference can be determined by looking at the URL: OneDrive URLs will have some variant of OneDrive in them, while SharePoint URLs are company-specific.

  1. If your notebook is on OneDrive, check whether you can free up space on OneDrive, or else buy more space.
  2. If you have exceeded the limit for SharePoint, you might need to contact the SharePoint administrator for help.

OneNote is not working

If the OneNote desktop software is not working, you may Repair your Microsoft Office installation via the Control Panel. This will reinstall the Microsoft OneNote software too.

If the OneNote Windows Store app is not working on your Windows 10 PC, then you may uninstall it using our 10AppsManager for Windows 10. Once done, you may install it again by searching for it in the Windows Store.
