Microsoft will bring 64-bit app support to ARM-based PCs in May

The content below is taken from the original ( Microsoft will bring 64-bit app support to ARM-based PCs in May), to continue reading please visit the site. Remember to respect the Author & Copyright.

One of the biggest limitations of the Windows on Snapdragon platform is its inability to run 64-bit apps. Microsoft has said on multiple occasions that it intends to eventually offer that support, but we've not heard about a firm timeline until now….

Cloudflare makes it harder for ISPs to track your web history

The content below is taken from the original ( Cloudflare makes it harder for ISPs to track your web history), to continue reading please visit the site. Remember to respect the Author & Copyright.

If you're privacy-minded, you probably aren't thrilled that governments seem hell-bent on giving internet providers free rein over your browsing data. Cloudflare just gave you a tool to fight back, however. It launched 1.1.1.1, a free Domain Name S…
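The excerpt above is cut off, but if you want to try the resolver right away on a Windows machine, here is a minimal sketch using the built-in DnsClient cmdlets (the interface alias "Ethernet" is a placeholder; list yours with Get-NetAdapter):

    # Point the adapter at Cloudflare's resolvers (1.1.1.1 primary, 1.0.0.1 secondary)
    Set-DnsClientServerAddress -InterfaceAlias "Ethernet" -ServerAddresses ("1.1.1.1", "1.0.0.1")
    # Confirm that queries resolve through the new server
    Resolve-DnsName example.com -Server 1.1.1.1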

Limit Bandwidth and set the Time when Windows Updates can download – Configure BITS Settings

The content below is taken from the original ( Limit Bandwidth and set the Time when Windows Updates can download – Configure BITS Settings), to continue reading please visit the site. Remember to respect the Author & Copyright.


This post will show you how to set the time window within which Windows 10 can download Windows Updates, using Group Policy or the Registry Editor to configure BITS settings. Does your internet suddenly slow down, despite […]
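The excerpt is truncated, but for reference, the Group Policy settings it describes are backed by registry values under the BITS policy key. A hedged sketch in PowerShell (the value names follow the documented BITS bandwidth policy; verify them against your Windows build before deploying):

    # Throttle BITS (and hence Windows Update downloads) to 1000 Kbps between 08:00 and 17:00
    $Key = "HKLM:\SOFTWARE\Policies\Microsoft\Windows\BITS"
    New-Item -Path $Key -Force | Out-Null
    Set-ItemProperty -Path $Key -Name EnableBITSMaxBandwidth -Value 1
    Set-ItemProperty -Path $Key -Name MaxTransferRateOnSchedule -Value 1000   # Kbps
    Set-ItemProperty -Path $Key -Name MaxBandwidthValidFrom -Value 8          # start hour
    Set-ItemProperty -Path $Key -Name MaxBandwidthValidTo -Value 17           # end hour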

This post Limit Bandwidth and set the Time when Windows Updates can download – Configure BITS Settings is from TheWindowsClub.com.

Now The Church of England takes Apple Pay and Google Pay

The content below is taken from the original ( Now The Church of England takes Apple Pay and Google Pay), to continue reading please visit the site. Remember to respect the Author & Copyright.

What can a church do when its younger parishioners stop carrying coins they can toss into the donation box? In the Church of England's case, it's to offer high-tech collection plates that accept Apple Pay, Google Pay and SMS mobile payments. Accordin…

Virtual Machine Serial Console access

The content below is taken from the original ( Virtual Machine Serial Console access), to continue reading please visit the site. Remember to respect the Author & Copyright.

Ever since I started working on the Virtual Machine (VM) platform in Azure, there has been one feature request that I consistently hear customers asking for us to build. I don’t think words can describe how excited I am to announce that today we are launching the public preview of Serial Console access for both Linux and Windows VMs.

Managing and running virtual machines can be hard. We offer extensive tools to help you manage and secure your VMs, including patch management, configuration management, agent-based scripting, automation, SSH/RDP connectivity, and support for DevOps tooling like Ansible, Chef, and Puppet. However, we have learned from many of you that sometimes this isn’t enough to diagnose and fix issues. Maybe a change you made resulted in an fstab error on Linux and you cannot connect to fix it. Maybe a bcdedit change you made pushed Windows into a weird boot state. Now you can debug both with direct serial-based access and fix these issues with minimal effort. It’s like having a keyboard plugged into the server in our datacenter, but from the comfort of your office or home.

Serial Console for Virtual Machines is available in all global regions starting today! You can access it by going to the Azure portal and visiting the Support + Troubleshooting section. See below for a quick video on how to access Serial Console.

[Video: accessing the Serial Console on a Linux VM]

Support for Serial Console comes naturally to Linux VMs. This capability requires no changes to existing images and will just start working. However, Windows VMs require a few additional steps to enable. For all platform images starting in March, we have already taken the required steps to enable the Special Administration Console (SAC), which is exposed via the Serial Console. You can also easily configure this on your own Windows VMs and images, as outlined in our Serial Console documentation. From the SAC, you can easily get to a command shell and interact with the system via the serial console as shown here:
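For your own images, the enabling steps boil down to turning on the Emergency Management Services (EMS) console with bcdedit; a minimal sketch from an elevated prompt (the documentation linked above is authoritative for the exact port and baud settings):

    # Route the Special Administration Console over the first serial port
    bcdedit /ems "{current}" on
    bcdedit /emssettings EMSPORT:1 EMSBAUDRATE:115200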

[Screenshot: the Windows Special Administration Console, reached through the Serial Console]

Serial Console access requires you to have VM Contributor or higher privileges to the virtual machine. This ensures that connections to the console are kept at the highest level of privilege to protect your system. Make sure you are using role-based access control to limit access to only those administrators who genuinely need it. All data sent back and forth is encrypted in transit.
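As a hedged sketch of that role assignment with the AzureRM PowerShell module of this era (the account and resource group names are placeholders):

    # Grant Virtual Machine Contributor on the resource group that holds the VM
    New-AzureRmRoleAssignment -SignInName "admin@contoso.com" `
        -RoleDefinitionName "Virtual Machine Contributor" `
        -ResourceGroupName "MyVmResourceGroup"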

I am thrilled to be offering this service on Azure VMs. Please try this out today and let us know what you think! You can learn more in this episode of Azure Friday, this Monday’s special episode of Tuesdays with Corey on Serial Console, or in our Serial Console documentation.


Thanks,

Corey

Europe dumps 300,000 UK-owned .EU domains into the Brexit bin

The content below is taken from the original ( Europe dumps 300,000 UK-owned .EU domains into the Brexit bin), to continue reading please visit the site. Remember to respect the Author & Copyright.

Bureaucrats break internet norms by vowing to ban Blighty-based bods from Euro TLD

Brexit has hit the internet, and not in a good way.…

The SMB’s Essential Disaster Recovery Checklist

The content below is taken from the original ( The SMB’s Essential Disaster Recovery Checklist), to continue reading please visit the site. Remember to respect the Author & Copyright.

While almost every business would agree that it’s essential to have a disaster recovery (DR) plan, the sad fact is that not all businesses do. Most of the businesses that don’t have a DR plan tend to be the small and medium-sized businesses (SMBs) that actually need one the most. Many of these SMBs could go out of business if they were hit by a disaster and could not recover effectively.

Today’s businesses need more availability and uptime than at any point in the past. Plus, they need to be able to deal with potential threats like ransomware as well as disasters and site outages. Creating an essential DR checklist is an important starting point for enacting your DR strategy. Let’s take a closer look at the main points that should be on your essential DR checklist.

  • Identify your critical business processes – Not all processes are created equal. You first need to identify your most important business applications. These are typically the essential applications your business needs to operate on a daily basis. Ideally, IT should meet with business management and/or application owners to identify and prioritize these applications.
  • Backup and optionally replicate your critical servers – Backup is the foundation of all DR strategies. Backup enables you to restore your servers to a known-good point in time, letting you recover your essential IT operations. In addition, while it’s no fun, your backups need to be tested periodically. Various backup products make this easier by automatically testing backup and restore validity. While backup provides basic protection, there is the problem of data loss between the time of the backup and the disaster event. Replication is another important technology that can be a key part of your DR plan. Replication can vastly improve your recovery times and reduce data loss by providing one or more replicas of your protected servers, refreshed at much more frequent intervals than backups; some products provide near real-time replication. In addition, restore time is typically far faster because a replica can be brought online quickly without a lengthy restore process.
  • Make sure you have an offsite backup – Having at least one copy of your backups offsite is necessary in order to recover from a site disaster. Site disasters like fires, floods, or hurricanes can render an entire location, along with all of its computing resources, unusable. Keeping backup copies offsite protects them from local events, ensuring that you have at least one good backup copy to use during your recovery.
  • Have a restore target (most likely in the cloud) – It’s great to have a backup, but it’s even better to have somewhere to restore it to. While many larger businesses have their own private DR sites, that kind of facility and its accompanying expense are far beyond the reach of most SMBs. However, using the cloud as a DR site is possible for most businesses. The cloud can house replica VMs that can be started up quickly in the event of a disaster, or you can use it to restore your backups to cloud-based VMs.
  • Assign DR roles – Having backups and DR targets to restore them to lays the foundation for a successful DR plan, but no plan runs itself. It is people who put your plans into action. You need to assign well-defined roles and responsibilities to the IT and other business personnel needed to help recover from a disaster.

An effective DR plan is essential: it can be the difference between surviving a disaster and going out of business entirely. If you don’t have a DR plan in place, following this DR checklist is a good way to get started.

The post The SMB’s Essential Disaster Recovery Checklist appeared first on Petri.

connectwise (17.1.1.0)

The content below is taken from the original ( connectwise (17.1.1.0)), to continue reading please visit the site. Remember to respect the Author & Copyright.

Connectwise installer for internal use

Choosing the Best Mobile Office 365 Email Client

The content below is taken from the original ( Choosing the Best Mobile Office 365 Email Client), to continue reading please visit the site. Remember to respect the Author & Copyright.

Moving Mobile Email to Office 365

I am frequently asked to recommend the best mobile client to use with Office 365. Usually, the question is what email client to use because it is in the context of a company moving from on-premises Exchange to Exchange Online. Mail is often the first workload a company moves to the cloud, so it is unsurprising that this issue arises, especially as Exchange has included native support for mobile clients since the advent of the ActiveSync server in Exchange 2003 SP1 (the real action started with Exchange 2003 SP2).

Mobile Office 365 Clients

Of course, a wide range of other mobile clients are available for other Office 365 applications, as you can see from those installed on my iPhone (Figure 1).


Figure 1: iPhone apps for Office 365 (image credit: Tony Redmond)

The apps receive regular updates and are generally of a high quality. iOS tends to be a little ahead of Android when it comes to functionality, but that varies from app to app. My biggest complaint at present is that the Teams mobile app still does not support switching between tenants. That feature is “coming,” just like Christmas.

The Success of Exchange ActiveSync

Originally designed to evangelize connectivity between the nascent Windows smartphones and Exchange to compete with RIM BlackBerry, Microsoft’s focus soon shifted to licensing Exchange ActiveSync (EAS) to as many mobile device vendors as possible.

Since 2006, Microsoft has done a great job of licensing EAS to all and sundry. Today, EAS is the common connectivity protocol for mobile devices for both Exchange on-premises and Exchange Online. Even Microsoft’s most ardent competitors, Google and Apple, license EAS.

The Problem with Exchange ActiveSync

Good as EAS is at connecting to Exchange, it is now an old protocol. Although Microsoft refreshed EAS (to version 16.1) last year, the functionality available through EAS is much the same as it ever was – synchronizing folders, sending and receiving email, updating the calendar, and maintaining contacts. If this is what you need, then EAS is the right protocol. And because EAS works so well, mobile device vendors can easily integrate EAS into their email clients to make them work with Exchange.

Except of course when new versions of an email app appear. Apple has a notable history of problems between the iOS mail app and Exchange, ranging from longstanding problems with calendar hijacking to issues with HTTP/2 when iOS 11 appeared. To be fair to both Apple and Microsoft, the two companies work together to resolve problems more effectively now than they did in the past, but the problems illustrate some of the difficulties that can creep in when mobile device vendors implement EAS.

A New Mobile Strategy

Up to late 2014, Microsoft’s strategy for mobile devices centered around EAS. Recognizing the limitations of the protocol, they also had “OWA for Devices,” essentially putting a wrapper around a browser instance running OWA on mobile devices. OWA for Devices never went anywhere fast, even if it was the only way to get certain functions on mobile devices like support for encrypted email or access to shared mailboxes.

Then Microsoft bought Acompli for $200 million to transform their mobile strategy and get them out of the hole they were heading into with OWA for Devices. The Acompli apps for iOS and Android had built up a loyal fan base because the clients worked well with Exchange, Gmail, and other servers, and included some unique functionality like the Focused Inbox, which is now available throughout the Outlook family.

Microsoft rebranded the Acompli apps as Outlook for iOS and Android in January 2015. After weathering an initial storm caused by some misleading assertions by security experts, two problems remained. First, the Outlook apps used EAS, but only to retrieve information from Exchange mailboxes and store the data on Amazon Web Services (AWS). Second, the clients used their own protocol to interact with the AWS store.

In 2016, Microsoft began to move the Outlook data from AWS to a new architecture based on Office 365 and Azure. Soon, the clients will use the same architecture to deliver the same functionality for Exchange on-premises.

The Outlook mobile clients still use their own protocol to communicate with Exchange. Why? The EAS protocol does not support all the functionality that the Outlook clients deliver, including the Focused Inbox, full mailbox search, and (most recently) protected email.

The way that Outlook deals with protected email is important. If you choose to protect email with Azure Information Protection, messages accessed through the Outlook mobile clients are more secure. By comparison, “unenlightened” clients like the Apple iOS mail app must remove that protection to store and display email. Coupled with Office 365 Multi-Factor Authentication (and the Microsoft Authenticator app), Outlook is a good choice for those who need the highest level of mobile email security available in Office 365.

Although the Outlook clients sometimes work differently to the way I would like, they are the best mobile clients for Exchange Online.

In summary, EAS is now the lowest common denominator for Exchange connectivity while all the new features and functionality appear in the Outlook clients.

Why Microsoft Will Not Upgrade EAS

Those who like using native email clients like the iOS mail app probably wonder why Microsoft doesn’t upgrade EAS to support the functionality needed by Outlook.

In a nutshell, Microsoft could upgrade EAS, but the engineering effort to do so cannot be justified. First, their own clients would then have to be retrofitted to use the “new EAS.” Second, no guarantee exists that mobile device vendors would upgrade their mail apps to exploit the features exposed by an upgraded API. Microsoft could ask the likes of Samsung, Apple, and Google to support new features, but it is likely that they would not.

The upshot is a lot of expense for Microsoft with no prospect of any positive outcome.


Into the Future

My answer to people who ask about mobile apps for Office 365 is that if users are happy with the native mail apps, then continue on that course. The users don’t realize what they are missing. On the other hand, if you want users to have the best functionality, you need to use the Outlook clients. That is where Microsoft’s focus is today, and it is where new features will appear in the future. Hopefully, Microsoft will deliver some long-awaited functionality, like support for shared mailboxes, soon.

And don’t forget the other mobile apps for Office 365. With such a selection available today, I don’t know how we ever managed to do any work on the road in the past…

Follow Tony on Twitter @12Knocksinna.

Want to know more about how to manage Office 365? Find what you need to know in “Office 365 for IT Pros”, the most comprehensive eBook covering all aspects of Office 365. Available in PDF and EPUB formats (suitable for iBooks) or for Amazon Kindle.

The post Choosing the Best Mobile Office 365 Email Client appeared first on Petri.

How to TAG files in Windows 10 & use it to make File Search efficient

The content below is taken from the original ( How to TAG files in Windows 10 & use it to make File Search efficient), to continue reading please visit the site. Remember to respect the Author & Copyright.

Windows 10 has powerful search built into the system, especially with Cortana, which allows you to search smartly using filters like music, images, PDF, and so on. One of the most underrated but efficient ways to search files […]

This post How to TAG files in Windows 10 & use it to make File Search efficient is from TheWindowsClub.com.

Boring Company to start selling LEGO-like interlocking bricks made from tunneling rock

The content below is taken from the original ( Boring Company to start selling LEGO-like interlocking bricks made from tunneling rock), to continue reading please visit the site. Remember to respect the Author & Copyright.

Elon Musk announced that the Boring Company will sell LEGO-like interlocking bricks made from rock that his tunneling machines excavate from the earth. Musk stated these bricks will be sold in “kits” and will be rated to withstand California’s earthquakes. 

The availability date and cost of this latest product are unknown. Since his company has only started digging short tunnels, there is not yet enough excavated rock to begin making these bricks.

Celebrate World Backup Day on March 31, 2018 – Are You Ready?

The content below is taken from the original ( Celebrate World Backup Day on March 31, 2018 – Are You Ready?), to continue reading please visit the site. Remember to respect the Author & Copyright.

There’s a “DAY” for almost anything these days, but here’s one that should be on your calendar – World Backup Day, March 31, 2018. Whether it’s… Read more at VMblog.com.

Why PowerShell is a Core Skill for Office 365 Administrators

The content below is taken from the original ( Why PowerShell is a Core Skill for Office 365 Administrators), to continue reading please visit the site. Remember to respect the Author & Copyright.


Office 365 Pros Know PowerShell

Because I come from the Exchange side of the Office 365 house, PowerShell is a natural tool for me to turn to whenever I need to do something with Office 365 that Microsoft hasn’t included in the admin tools. The PowerShell coverage for Exchange is deep and extensive, even in the cloud. By comparison, PowerShell is not well covered in other Office 365 applications. Skype for Business Online has some administration functions while SharePoint Online offers mediocre support. Planner has no support, and the first version of the Teams PowerShell module could be so much better.

Given the spotty coverage in other parts of the service, I guess it should come as no surprise that Office 365 administrators who do not have a background in Exchange might consider PowerShell to be an odd but sometimes useful command-line interface. But that’s not the case. Simply put, PowerShell is a core skill for Office 365 administrators.

PowerShell Quirks

It’s true that PowerShell has its quirks. Like any scripting language, PowerShell syntax can be baffling and obscure, so using an IDE is the best approach for someone starting out. Writing raw PowerShell in the console is for masochists.

PowerShell has significant scalability limitations too, especially inside Office 365 where throttling controls clamp down on anyone who tries to consume resources with abandon. PowerShell will not process tens of thousands of objects rapidly, but that’s not its purpose.

If you think you need to process large numbers of Office 365 objects, listen to the recording of the seminar by MVPs Alan Byrne and Vasil Michev. The techniques they explain will help you get the job done, but it won’t be quick.

Why Admins Need PowerShell

The reasons why Office 365 administrators need to achieve a basic level of competency with PowerShell are varied. Here are my top picks.

The Office 365 Admin Tools are Not Perfect

Beauty is in the eye of the beholder, and Microsoft probably thinks that its admin tools are just fine, but some of the more interesting jobs you might want to do require you to plunge into PowerShell. A recent example is the provision of cmdlets to recover deleted items for users without the need to log into their accounts.

Another is the support article cited in my article on GDPR data spillage. The list of steps needed to discover and report all the holds in place for a mailbox that must be temporarily lifted to remove items is long and prone to error. Scripting the retrieval and release of holds for a mailbox would automate the process and make it easier to stand over in court, should the need arise to justify the removal of held information. Finally, I point to the need to enable mailbox auditing for new mailboxes to ensure audit data flows into the Office 365 Audit Log. This problem has been around for years and it’s surprising that Exchange Online does not enable auditing by default. But you can, with PowerShell.
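As a minimal sketch of that last point, using standard Exchange Online cmdlets in a connected remote PowerShell session:

    # Turn on mailbox auditing for every user mailbox in the tenant
    Get-Mailbox -ResultSize Unlimited -RecipientTypeDetails UserMailbox |
        Set-Mailbox -AuditEnabled $true

Run this periodically (or as part of your provisioning script) so that newly created mailboxes are covered too.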

Microsoft Cannot Anticipate Every Possible Admin Task

Try to write down all the tasks that you think an Office 365 admin will perform in a year. Once you get past the easy stuff like creating accounts, monitoring usage reports, and so on, it becomes increasingly difficult to anticipate just what admins will be called upon to do. The Office 365 Admin Center and the other associated consoles represent a lot of functionality, but there’s always the possibility that you might have to do something that isn’t available as a menu choice in a GUI.

Two recent examples are how to archive inactive Office 365 Groups (and Teams) and how to identify when Groups and Teams are not being used. Microsoft offers the Azure Active Directory expiration policy for Groups, but this is based on time (that is, a group expires after a set period) instead of activity, which creates the possibility that Office 365 could expire and remove your most important teams or groups even though they are in active use daily. You can easily recover the expired groups (within 30 days), but that’s not the point. It’s better to understand what groups and teams are active and act on that basis.
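One hedged way to approximate activity with standard cmdlets is to check when a group mailbox last received a conversation item (the 90-day cutoff is arbitrary, and this ignores SharePoint and Teams activity, so treat it as a first pass):

    # Flag Office 365 Groups with no new inbox items in the last 90 days
    $Cutoff = (Get-Date).AddDays(-90)
    Get-UnifiedGroup -ResultSize Unlimited | ForEach-Object {
        $Stats = Get-MailboxFolderStatistics -Identity $_.Alias -FolderScope Inbox -IncludeOldestAndNewestItems
        if ($Stats.NewestItemReceivedDate -lt $Cutoff) {
            Write-Output ("Possibly inactive: " + $_.DisplayName)
        }
    }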

Some Office 365 Features need PowerShell

The group expiration policy has a GUI (in the Azure portal) to work with its settings, but many Office 365 features need admins to run some PowerShell commands to set things up. The Office 365 Groups policy is a good example. If you want to set up a naming policy or restrict group creation to a defined set of users, you need PowerShell.
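For instance, restricting group creation is done through the Group.Unified directory setting in Azure AD. A hedged sketch with the AzureAD (Preview) cmdlets (the security group name is a placeholder, and the code assumes the search matches exactly one group):

    # Allow only members of "Group Creators" to create Office 365 Groups
    Connect-AzureAD
    $Template = Get-AzureADDirectorySettingTemplate | Where-Object DisplayName -eq "Group.Unified"
    $Setting = $Template.CreateDirectorySetting()
    $Setting["EnableGroupCreation"] = "false"
    $Setting["GroupCreationAllowedGroupId"] = (Get-AzureADGroup -SearchString "Group Creators").ObjectId
    New-AzureADDirectorySetting -DirectorySetting $Setting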

PowerShell Helps You Understand Office 365 Better

Understanding how a technology works is a great way to master it. For instance, running the Get-MailboxStatistics cmdlet against a group mailbox reveals its contents. You might or might not be interested in this information, but it is surprising how often detail like this has proven invaluable.
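A quick sketch of what that looks like (the group name is hypothetical):

    # Item count and total size for a group mailbox
    Get-MailboxStatistics -Identity "Marketing Team" | Format-List DisplayName, ItemCount, TotalItemSize
    # Folder-level breakdown of the same group mailbox
    Get-MailboxFolderStatistics -Identity "Marketing Team" | Format-Table Name, ItemsInFolder, FolderSize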

PowerShell Is Not Hard

I am not a programmer now. I used to be, with VAX COBOL and VAX BASIC, in the last millennium, but I can cheerfully hack away with PowerShell and get stuff done. Anyone can too. It’s not hard and a ton of useful examples and advice exists on the web (here’s a good start). Of course, you should never download and run a script in your production environment without carefully examining (and understanding) the code first, but that does not take away from the point that you are not alone.


PowerShell is Fun

Perhaps oddly, PowerShell can be fun too. A sense of achievement comes when a recalcitrant script finally works to make Office 365 give up some secrets or some piece of data becomes more understandable. Although Microsoft might create a perfect nirvana of administration within Office 365, tenant admins need some competence with PowerShell for the foreseeable future. The sooner you start, the better you’ll be.

Follow Tony on Twitter @12Knocksinna.

Want to know more about how to manage Office 365? Find what you need to know in “Office 365 for IT Pros”, the most comprehensive eBook covering all aspects of Office 365. Available in PDF and EPUB formats (suitable for iBooks) or for Amazon Kindle.

The post Why PowerShell is a Core Skill for Office 365 Administrators appeared first on Petri.

Polish bank begins using a blockchain-based document management system

The content below is taken from the original ( Polish bank begins using a blockchain-based document management system), to continue reading please visit the site. Remember to respect the Author & Copyright.

A blockchain company called Coinfirm has announced a partnership with PKO BP, a major Polish bank, to provide blockchain-based document verification using a tool called Trudatum. The project is an actual implementation of one of the primary benefits of blockchain-based tools, namely the ability to permanently and immutably store data. This announcement brings blockchain implementations out of the realm of proof-of-concept and into the real world.

“Every document recorded in the blockchain (e.g. proof of a transaction, or bank’s terms and conditions for a given product) will be issued in the form of irreversible abbreviation or hash signed with the bank’s private key. This will allow a client to verify remotely if the files he received from a business partner or from the bank are true, or if a modification of the document was attempted,” wrote the Coinfirm team.
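To make the hashing half of that concrete, here is an illustration of the general pattern (my sketch, not Trudatum’s actual implementation; signing the hash with the bank’s private key is the extra step that proves who issued it):

    # Fingerprint a document, then verify a received copy against it (file names are placeholders)
    $Original = (Get-FileHash -Path .\terms.pdf -Algorithm SHA256).Hash
    $Received = (Get-FileHash -Path .\terms-received.pdf -Algorithm SHA256).Hash
    if ($Received -eq $Original) { "Document is unchanged" } else { "Document was modified" }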

Coinfirm founders Paweł Kuskowski, Pawel Aleksander, and Maciej Ziółkowski have experience in cryptocurrency and banking, and they bootstrapped the company over the past two years. They also run a blockchain-based AML/KYC platform for investments that is reaching the break-even point. They entered the world of blockchain after becoming frustrated with banking, but the industry sucked them back in.

“Together with Pawel Aleksander we decided to leave the banking world as we saw that the AML process in the financial industry is broken – it’s very arbitrary, takes thousands of people, and has a very low efficiency,” said Kuskowski. “Our early observation of the digital currency space and its challenges showed a huge need for AML solutions. Also, because of the nature of the ledgers, we could create data-driven, machine-learning-based software as opposed to the people-based process, prone to human error and subjectivity, that is the standard for the banking industry. Once we understood the blockchain technology better we continued to launch new products that are using it to solve compliance challenges – starting with the Coinfirm AML/KYC Platform, and then Trudatum.”

The Trudatum tool essentially allows PKO BP to create “durable media” – “a digital solution for storing all agreements with clients that is now required by the law.”


For their part, PKO BP is pleased with the pilot project, making it one of the first European banks to publicly admit that they’re using a blockchain tool for document management.

“Coinfirm is one of the startups that we discovered thanks to the ‘Let’s Fintech with PKO Bank Polski’ acceleration process,” said Adam Marciniak, a Vice President at PKO BP. “It already has considerable experience in blockchain technology acquired in several countries. Last year we started tests of the Trudatum platform developed by Coinfirm. As tests in the banking environment were highly satisfying, we decided to cooperate more closely. We believe that together we will be able to carry out a pioneering operation of implementing blockchain technology into the Polish banking sector.”

This electric 1959 Mini Cooper is everything that’s right in the world

The content below is taken from the original ( This electric 1959 Mini Cooper is everything that’s right in the world), to continue reading please visit the site. Remember to respect the Author & Copyright.

Take a break from the dumpster fire that is 2018. This electric Mini will make you smile.

Built as a show piece, the car features an electric powertrain in a restored 1959 Mini Cooper. Of course it’s red with a white stripe, and, of course, there are rally lights across the grille. This is how a Mini should look, and an electric powertrain should make it feel the part, too.

Minis are supposed to be oversized go-karts that go like mad with near-instant acceleration. And that’s the best part of electric vehicles: instant torque that produces insane acceleration.

Mini hasn’t revealed the range or capabilities of this show car. Its purpose is mostly to draw attention to Mini’s other electric vehicles and concepts. Mini has been producing electric vehicles since 2008, when it created a limited run of the Mini E, which informed the development of the BMW i3. More recently, Mini announced the Mini Electric Concept and intends to put it on the market by 2019.

But forget about that new car that’s sure to be overloaded with screens, LEDs and silly things like airbags. None of that stuff will make people smile as much as a classic Mini Cooper.

How to customize Notifications and Action Center on Windows 10

The content below is taken from the original ( How to customize Notifications and Action Center on Windows 10), to continue reading please visit the site. Remember to respect the Author & Copyright.

We all use our PCs for work, and getting distracted for any reason breaks concentration. Just like your phone, Windows 10 apps and the system send out notifications. They are there for a reason, but if they are […]

This post How to customize Notifications and Action Center on Windows 10 is from TheWindowsClub.com.

Build your own PC inside the PC you built with PC Building Simulator

The content below is taken from the original ( Build your own PC inside the PC you built with PC Building Simulator), to continue reading please visit the site. Remember to respect the Author & Copyright.

Considering we’ve got simulators for everything from driving a junker (x2) to moving into a neighborhood with a bunch of hot dads in it, I suppose it was only a matter of time until someone made a game where you assemble your own PC. It’s called PC Building Simulator, as you might guess, and it looks fabulous.

I’ve built all my PCs over the years, including my current one, which I really should have waited on, since the early Skylake mobos were apparently trash. I’m sure we can line up the screw holes better than that, MSI!

What was I talking about? Oh yes, the simulator. This is no joke game: it uses real, licensed parts from major manufacturers, which are (or will be) simulated down to their power draws, pin counts, and so on. So if you pick a power supply without enough Molex connectors to handle your SLI rig and PCIe solid-state system drive (or whatever), it won’t start. Or if you try to close the ultra-slim case with an 8-inch-tall heatsink on your overclocked CPU, it’ll just clank. (Some of these features are still in development.)

Add LEDs inside the case, replace the side panel with acrylic (no!), try out a few cooling solutions… the possibilities are endless. Especially since manufacturers like Corsair, AMD, and so on seem hot to add perfectly modeled virtual versions of their components to the selection.

There’s even a “game” aspect where you can start your own PC repair business — someone sends you a machine that won’t boot, or shuts down randomly, and you get to figure out why that is. Run a virus scan, reseat the RAM, all that. Damn, this sounds just like my actual life.

Seriously though, this is great — it might help more people get over the idea that building a PC is difficult. I mean, it is, but at least here you can go through the motions so it isn’t a total mystery when you give it a shot.

The best part is that this game is made by a teenager who put together the original as a lark (it’s free on itch.io) and attracted so much attention that it’s been blown up into a full-blown game. Well, an Early Access title, anyway.

SUSE bakes a Raspberry Pi-powered GNU/Linux Enterprise Server

The content below is taken from the original ( SUSE bakes a Raspberry Pi-powered GNU/Linux Enterprise Server), to continue reading please visit the site. Remember to respect the Author & Copyright.

Industry can have a slice of steaming supported stability … if it can afford to pay

SUSE Linux Enterprise Server 12 SP3 (SLES) has been released for the diminutive Raspberry Pi computer.…

Yes, You Can Use Your On-Premises Data with Office 365

The content below is taken from the original ( Yes, You Can Use Your On-Premises Data with Office 365), to continue reading please visit the site. Remember to respect the Author & Copyright.

Say what? What! <edited out the back-and-forth dialog between Samuel L. Jackson and some poor kid in the inappropriate-for-work movie playing in my head>


For a lot of people, hybrid just means an Active Directory that works in both the cloud and on-premises. All of the other IT functionality is purely online or purely on-premises. Exchange is hosted in O365. They have SharePoint in both, but neither talks to the other. There are a whole host of other solutions, and calling them hybrid can be a stretch. I think of it more like they have two data centers that just happen to share the same username and password.

Well, it doesn’t have to be that way. One of the smartest pieces of technology Microsoft has created in years flies under the radar but it can get you out of this “here or there” mentality. The software is called the On-Premises Data Gateway.

The On-Premises Data Gateway

The Data Gateway (that is what we are going to call it to cut down on wordiness) allows you to connect your on-premises data to several tools in the Microsoft Cloud. PowerApps, Power BI, Microsoft Flow, Azure Logic Apps, Azure Analysis Services, and probably a few others all natively support the Data Gateway. This allows you to truly bridge the gap and build hybrid environments. It is also the answer to the question of how to deal with the fact that amazing tools like PowerApps, Power BI, and Flow will never come on-premises. No problem, you can bring on-premises to them.

Now you might be thinking: why is it you haven’t heard of this product before? Why can’t Shane tell us exactly what platforms it works with and where to get more info? Turns out, in my opinion, this is the only flaw with the Data Gateway. Its greatest feature is its biggest weakness.

The Best Feature

You only need one Data Gateway installed on-premises and it works with all of the tools I listed above. What simple form of genius is this? I am shocked that this single tool was built and got all of these other teams to play nice with it. Better news? Installing it takes about 5 minutes and there is almost nothing to configure. So why is this amazing news also a weakness?

The reason you haven’t heard of the tool, and that the details are hard to come by, is that the Data Gateway doesn’t have a standalone website. All of the documentation for the product is hidden on the various websites of the products it works with. This can be good if you only care about it from one point of view, but I am a harlot when it comes to this tech. I want all-up documentation. Sad face.

If you want more info or to install the product, then you need to go to the various product sites to do so. The nice thing is this gives you product-specific context. Here are the sites that I know of:

Are you overwhelmed by that? I was. So many Gateways, but remember: you only need one. After you set up the Gateway for PowerApps, you are done. If next week you decide you want to also use on-premises data with Power BI, no problem. You do NOT have to install another gateway.

Speaking of installs: because I love you, I have also gotten into this chaos. I have made two different videos, one from the PowerApps point of view and one from the Power BI angle. I found it makes a lot more sense to install, configure, and build something using the gateway if you do it product by product. Enjoy.

Licensing

There is no license required to install the Data Gateway. Instead, licensing, if any, is handled by the various products. For example, with PowerApps the Data Gateway is available to almost everyone. The only exceptions are users licensed for Office 365 Business, Office 365 Enterprise E1, and Office 365 Enterprise F1. Once again, the downside is that I can’t point you to one place where that is called out. So make sure, as you make plans to use this great tool, that you dig into the licensing. This is especially true if you are in a mixed licensing environment. You will most likely be fine, but I want you to look before you leap. Random fact? Power BI even has a personal Gateway mode. So cool.

Security

I have bad news for your security and firewall teams. We don’t need them. Hooray. The very non-nerdy explanation of the way this works: the Data Gateway calls out to Azure Service Bus looking for work. Now, if you have strict outbound security and proxy servers (like we did in the late ’90s), then you may have to make some changes. But if I were you, I would not overthink it; just install the Data Gateway and give it a try. If it cannot find the Azure Service Bus (the internet), it will let you know. Then you can try to configure around it.


There Is So Much Fun to be Had Here. What Are You Waiting For?

Imagine a Power BI dashboard showing on-premises and online data in the same report. Imagine workflows that take SharePoint content from on-premises and publish it to the cloud. The sky is the limit. You just need to deploy one instance of the Data Gateway, and then you can go crazy. Leave me comments below and tell me what doors this opens for you. I love success stories.


Shane

@ShanesCows

The post Yes, You Can Use Your On-Premises Data with Office 365 appeared first on Petri.

DJI will let developers fully customize its drones

The content below is taken from the original ( DJI will let developers fully customize its drones), to continue reading please visit the site. Remember to respect the Author & Copyright.

Drone company DJI is expanding its efforts in the commercial sector with a new thermal imaging camera and a payload software development kit (SDK) that will allow startups and developers to integrate custom gear onto DJI drones.

Now, you can automatically document your API with Cloud Endpoints

The content below is taken from the original ( Now, you can automatically document your API with Cloud Endpoints), to continue reading please visit the site. Remember to respect the Author & Copyright.

With Cloud Endpoints, our service for building, deploying and managing APIs on Google Cloud Platform (GCP), you get to focus on your API’s logic and design, and our team handles everything else. Today, we’re expanding “everything else” and announcing new developer portals where developers can learn how to interact with your API.

Developer portals are the first thing your users see when they try to use your API, and are an opportunity to answer many of their questions: How do I evaluate the API? How do I get working code that calls the API? And for you, the API developer, how do you keep this documentation up-to-date as your API develops and changes over time?

Much like with auth, rate-limiting and monitoring, we know you prefer to focus on your API rather than on documentation. We think it should be easy to stand up a developer portal that’s customized with your branding and content, and that requires minimal effort to keep its contents fresh.

Here’s an example of a developer portal for the Swagger Petstore (YAML):

The portal includes, from left to right, the list of methods and resources, any custom pages that the API developer has added, details of the individual API method and an interactive tool to try out the API live!

If you’re already using Cloud Endpoints, you can start creating developer portals immediately by signing up for this alpha. The portal will always be up-to-date; any specification you push with gcloud also gets pushed to the developer portal. From the portal, you can browse the documentation, try the APIs interactively alongside the docs, and share the portal with your team. You can point your custom domain at it, for which we provision an SSL certificate, and add your own pages for content such as tutorials and guides. And perhaps the nicest thing is that this portal works out of the box for both gRPC and OpenAPI—so your docs are always up-to-date, regardless of which flavor of APIs you use.
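For context, pushing a specification is a one-liner (a hedged example; the file name is a placeholder):

    # Deploy an OpenAPI spec to Cloud Endpoints; the portal picks up the new version automatically
    gcloud endpoints services deploy openapi.yaml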

Please reach out to our team if you’re interested in testing out Cloud Endpoints developer portals. Your feedback will help us shape the product and prioritize new features over the coming months.

Google Home’s multi-room audio now works with Bluetooth speakers

The content below is taken from the original ( Google Home’s multi-room audio now works with Bluetooth speakers), to continue reading please visit the site. Remember to respect the Author & Copyright.

Google Home is getting a long-awaited feature: Bluetooth. Previously, only Google Cast-enabled speakers could be looped in to a network of Home-commanded devices. Now users can pair their speaker of choice with the dedicated Home app and voice comman…

Take charge of your sensitive data with the Cloud Data Loss Prevention (DLP) API

The content below is taken from the original ( Take charge of your sensitive data with the Cloud Data Loss Prevention (DLP) API), to continue reading please visit the site. Remember to respect the Author & Copyright.

This week, we announced the general availability of the Cloud Data Loss Prevention (DLP) API, a Google Cloud security service that helps you discover, classify and redact sensitive data at rest and in real-time.

When it comes to properly handling sensitive data, the first step is knowing where it exists in your data workloads. This not only helps enterprises more tightly secure their data, it’s a fundamental component of reducing risk in today’s regulatory environment, where the mismanagement of sensitive information can come with real costs.

The DLP API is a flexible and robust tool that helps identify sensitive data like credit card numbers, social security numbers, names and other forms of personally identifiable information (PII). Once you know where this data lives, the service gives you the option to de-identify that data using techniques like redaction, masking and tokenization. These features help protect sensitive data while allowing you to still use it for important business functions like running analytics and customer support operations. On top of that, the DLP API is designed to plug into virtually any workload—whether in the cloud or on-prem—so that you can easily stream in data and take advantage of our inspection and de-identification capabilities.

In light of data privacy regulations like GDPR, it’s important to have tools that can help you uncover and secure personal data. The DLP API is also built to work with your sensitive workloads and is supported by Google Cloud’s security and compliance standards. For example, it’s a covered product under our Cloud HIPAA Business Associate Agreement (BAA), which means you can use it alongside our healthcare solutions to help secure PII.

To illustrate how easy it is to plug DLP into your workloads, we’re introducing a new tutorial that uses the DLP API and Cloud Functions to help you automate the classification of data that’s uploaded to Cloud Storage. This function uses DLP findings to determine what action to take on sensitive files, such as moving them to a restricted bucket to help prevent accidental exposure.

In short, the DLP API is a useful tool for managing sensitive data—and you can take it for a spin today for up to 1 GB at no charge. Now, let’s take a deeper look at its capabilities and features.

Identify sensitive data with flexible predefined and custom detectors

Backed by a variety of techniques including machine learning, pattern matching, mathematical checksums and context analysis, the DLP API provides over 70 predefined detectors (or “infotypes”) for sensitive data like PII and GCP service account credentials.
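As a hedged sketch of what invoking a predefined detector looks like against the v2 REST surface (the project ID is a placeholder; field names follow the public API reference, and the call runs fine from PowerShell via Invoke-RestMethod):

    # Inspect a string for US Social Security numbers with the DLP v2 content:inspect method
    $Token = gcloud auth print-access-token
    $Body = @{
        item          = @{ value = "His SSN was 222-22-2222" }
        inspectConfig = @{ infoTypes = @(@{ name = "US_SOCIAL_SECURITY_NUMBER" }) }
    } | ConvertTo-Json -Depth 5
    Invoke-RestMethod -Method Post `
        -Uri "https://dlp.googleapis.com/v2/projects/my-project/content:inspect" `
        -Headers @{ Authorization = "Bearer $Token" } `
        -ContentType "application/json" -Body $Body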

You can also define your own custom types using:

  • Dictionaries — find new types or augment the predefined infotypes 
  • Regex patterns — find your own patterns and define a default likelihood score 
  • Detection rules — enhance your custom dictionaries and regex patterns with rules that can boost or reduce the likelihood score based on nearby context or indicator hotwords like “banking,” “taxpayer,” and “passport.”

Stream data from virtually anywhere

Are you building a customer support chat app and want to make sure you don’t inadvertently collect sensitive data? Do you manage data that’s on-prem or stored on another cloud provider? The DLP API “content” mode allows you to stream data from virtually anywhere. This is a useful feature for working with large batches to classify or dynamically de-identify data in real-time. With content mode, you can scan data before it’s stored or displayed, and control what data is streamed to where.

Native discovery for Google Cloud storage products

The DLP API has native support for data classification in Cloud Storage, Cloud Datastore, and BigQuery. Just point the API at your Cloud Storage bucket or BigQuery table, and we handle the rest. The API supports:

  • Periodic scans — trigger a scan job to run daily or weekly 
  • Notifications — launch jobs and receive Cloud Pub/Sub notifications when they finish; this is great for serverless workloads using Cloud Functions
  • Integration with Cloud Security Command Center (Alpha)
  • SQL data analysis — write the results of your DLP scan into the BigQuery dataset of your choice, then use the power of SQL to analyze your findings. You can build custom reports in Google Data Studio or export the data to your preferred data visualization or analysis system.
A summary report of DLP findings on recent scans

Redact data from free text and structured data at the same time

With the DLP API, you can stream unstructured free text, use our powerful classification engine to find different sensitive elements and then redact them according to your needs. You can also stream in tabular text and redact it based on the record types or column names. Or do both at the same time, while keeping integrity and consistency across your data. For example, you can take a social security number that’s classified in a comment field as well as in a structured column, and it generates the same token or hash.

Extend beyond redaction with a full suite of de-identification tools

From simple redaction to more advanced format-preserving tokenization, the DLP API offers a variety of techniques to help you redact sensitive elements from your data while preserving its utility.

Below are a few supported techniques:

  • Replacement – Replaces each input value with the infoType name or a user-customized value
  • Redaction – Redacts a value by removing it
  • Mask or partial mask – Masks a string either fully or partially by replacing a given number of characters with a specified fixed character
  • Pseudonymization with cryptographic hash – Replaces input values with a string generated using a given data encryption key
  • Pseudonymization with format-preserving token – Replaces an input value with a “token,” or surrogate value, of the same length using format-preserving encryption (FPE) with the FFX mode of operation
  • Bucket values – Masks input values by replacing them with “buckets,” or ranges within which the input value falls
  • Extract time data – Extracts or preserves a portion of dates or timestamps
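As a hedged sketch, here is the mask transformation applied through the same REST surface (placeholders as in the earlier inspect example):

    # Mask the first 12 characters of any detected credit card number with '#'
    $Body = @{
        item             = @{ value = "My card is 4111-1111-1111-1111" }
        inspectConfig    = @{ infoTypes = @(@{ name = "CREDIT_CARD_NUMBER" }) }
        deidentifyConfig = @{
            infoTypeTransformations = @{
                transformations = @(@{
                    primitiveTransformation = @{
                        characterMaskConfig = @{ maskingCharacter = "#"; numberToMask = 12 }
                    }
                })
            }
        }
    } | ConvertTo-Json -Depth 10
    Invoke-RestMethod -Method Post `
        -Uri "https://dlp.googleapis.com/v2/projects/my-project/content:deidentify" `
        -Headers @{ Authorization = "Bearer $(gcloud auth print-access-token)" } `
        -ContentType "application/json" -Body $Body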

The Cloud DLP API can also handle standard bitmap images such as JPEGs and PNGs. Using optical character recognition (OCR) technology, the DLP API analyzes the text in images to return findings or generate a new image with the sensitive findings blocked out.

Measure re-identification risk with k-anonymity and l-diversity

Not all sensitive data is immediately obvious, like a social security number or credit card number. Sometimes you have data where only certain values, or combinations of values, identify an individual. For example, a field containing an employee’s job title doesn’t identify most employees. However, it does single out individuals with unique job titles like “CEO,” where there’s only one employee with that title. Combined with other fields such as company, age, or zip code, you may arrive at a single, identifiable individual.

To help you better understand these kinds of quasi-identifiers, the DLP API provides a set of statistical risk analysis metrics. For example, risk metrics such as k-anonymity can help identify these outlier groups and give you valuable insights into how you might want to further de-identify your data, perhaps by removing rows and bucketing fields.
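For reference, the textbook definition (not specific to the DLP API): a dataset D satisfies k-anonymity over a set of quasi-identifier attributes QI when every record shares its QI values with at least k-1 other records, that is,

    \forall r \in D : \; \bigl| \{\, r' \in D : QI(r') = QI(r) \,\} \bigr| \ge k

The lone “CEO” row above is a k = 1 outlier, which is exactly the kind of record these risk metrics surface.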

Use k-anonymity to help find identifiable individuals in your datasets

Integrate the DLP API into your workloads across the cloud ecosystem

The DLP API is built to be flexible and scalable, and includes several features to help you integrate it into your workloads, wherever they may be.

  • DLP templates — Templates allow you to configure and persist how you inspect your data and define how you want to transform it. You can then simply reference the template in your API calls and workloads, allowing you to easily update templates without having to redeploy new API calls or code.
  • Triggers — Triggers allow you to set up jobs to scan your data on a periodic basis, for example, daily, weekly or monthly. 
  • Actions — When a large scan job is done, you can configure the DLP API to send a notification with Cloud Pub/Sub. This is a great way to build a robust system that plays well within a serverless, event-driven ecosystem.

The DLP API can also integrate with our new Cloud Security Command Center Alpha, a security and data risk platform for Google Cloud Platform that helps enterprises gather data, identify threats, and act on them before they result in business damage or loss. Using the DLP API, you can find out which storage buckets contain sensitive and regulated data, help prevent unintended exposure, and ensure access is based on need-to-know. Click here to sign up for the Cloud Security Command Center Alpha.

The DLP API integrates with Cloud Security Command Center to surface risks associated with sensitive data in GCP

Sensitive data is everywhere, but the DLP API can help make sure it doesn’t go anywhere it’s not supposed to. Watch this space for future blog posts that show you how to use the DLP API for specific use cases.

IBM, HPE tout new A.I.-oriented servers

The content below is taken from the original ( IBM, HPE tout new A.I.-oriented servers), to continue reading please visit the site. Remember to respect the Author & Copyright.

IBM and Hewlett Packard Enterprise this week introduced new servers optimized for artificial intelligence, and the two had one thing in common: Nvidia technology.

HPE this week announced Gen10 of its HPE Apollo 6500 platform, running Intel Skylake processors and up to eight Pascal or Volta Nvidia GPUs connected by NVLink, Nvidia’s high-speed interconnect.

A server fully loaded with V100s will get you 66 peak double-precision teraflops of performance, which HPE says is three times the performance of the previous generation.

The Apollo 6500 Gen10 platform is aimed at deep-learning workloads and traditional HPC use cases. The NVLink technology is up to 10 times faster than PCI Express Gen 3 interconnects.


battoexeconverter (3.0.10.0)

The content below is taken from the original ( battoexeconverter (3.0.10.0)), to continue reading please visit the site. Remember to respect the Author & Copyright.

Bat To Exe Converter can convert BAT (.bat) script files to the EXE (.exe) format.