Google Designs Data Center Appliance to Ship Client Data

The content below is taken from the original (Google Designs Data Center Appliance to Ship Client Data), to continue reading please visit the site. Remember to respect the Author & Copyright.

For a large enterprise, one of the costliest and most time-consuming steps in moving to the cloud is transferring the enormous amount of data stored in its on-premises data centers to its cloud provider’s data centers. Network bandwidth is a precious resource, and even when you have plenty of it, moving petabytes of data over a WAN can take far longer than is practical.
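
As a rough back-of-the-envelope illustration (not from the article), transfer time is simply data size divided by usable bandwidth; the short Python sketch below shows why a multi-petabyte migration over even a fast WAN is measured in months:

```python
def transfer_days(data_tb: float, bandwidth_mbps: float, efficiency: float = 0.8) -> float:
    """Estimate days needed to move data_tb terabytes over a link of
    bandwidth_mbps megabits/s, assuming only `efficiency` of the raw
    bandwidth is usable (protocol overhead, contention, and so on)."""
    bits = data_tb * 8e12                       # 1 TB = 8e12 bits (decimal TB)
    seconds = bits / (bandwidth_mbps * 1e6 * efficiency)
    return seconds / 86400

# One petabyte over a dedicated 1 Gbps link is still on the order of months.
print(f"{transfer_days(1000, 1000):.0f} days")  # roughly 116 days
```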

Amazon Web Services solved this problem two years ago by introducing a service called Snowball. If you have lots of data you want to upload to AWS cloud storage, the company ships you a rugged storage appliance that you connect to your internal network, load with your data, and ship back to Amazon.

Today, Alphabet subsidiary Google announced the beta launch of a similar service, taking another step in its effort to catch up to AWS and Microsoft Azure in the enterprise cloud market. The service, creatively named Transfer Appliance, is slightly cheaper per TB than AWS Snowball, although the exact price difference will depend on your specific shipping costs.

Google’s time estimates for transferring data over networks with varying bandwidth (Image: Google)

Another difference is in the design of the appliance itself. Besides storing more data, Google’s Transfer Appliance is designed to be mountable in a standard 19-inch data center rack, while the Snowball looks more like a PC tower built for an active battlefield.

Each cloud provider offers two models of its data migration device. The two Transfer Appliance options are 100TB in a 2U box and 480TB in a 4U box. Snowball has a 50TB and an 80TB option.

The Google service costs $3 per TB or $3.75 per TB, depending on which of the two versions of the appliance you select. Curiously, the higher-volume version of the appliance commands the higher per-TB price.  You’re also responsible for shipping the Transfer Appliance (the service uses FedEx), which will run you about $500 for the 100TB model and $900 for the 480TB one.

Amazon’s Snowball service costs $4 per TB for the 50TB model or $3.12 per TB for the 80TB one. You pay for shipping to an Amazon facility too, and unlike Google, Amazon doesn’t provide set pricing for shipping, saying it will depend on your location and the shipping option you choose (e.g. 2-day or overnight).
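
Using the per-TB and shipping figures quoted above, a rough total-cost comparison for a full appliance looks like the sketch below; the Snowball shipping charge is an assumed placeholder, since Amazon publishes no flat rate:

```python
def google_total(model_tb: int) -> float:
    """Total cost of filling a Google Transfer Appliance, per the figures above."""
    per_tb = {100: 3.00, 480: 3.75}
    shipping = {100: 500.0, 480: 900.0}
    return model_tb * per_tb[model_tb] + shipping[model_tb]

def snowball_total(model_tb: int, shipping: float) -> float:
    """Total cost of filling an AWS Snowball; shipping varies by location and speed."""
    per_tb = {50: 4.00, 80: 3.12}
    return model_tb * per_tb[model_tb] + shipping

print(google_total(100))                  # 300 + 500 = 800.0
print(google_total(480))                  # 1800 + 900 = 2700.0
print(snowball_total(80, shipping=200))   # 249.6 + assumed 200 shipping = 449.6
```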

Amazon also made the shipping process itself easier by mounting its Kindle readers on the appliances. When a customer is done uploading their data to the device, the screen on the Kindle automatically displays the correct shipping label, and a shipping company is notified that the device is ready to be picked up.

In Google’s case, the customer has to email support to request a shipping label and wait until it arrives in the mail before they can ship the appliance.

Google may be outsourcing some portion of the data migration service to the enterprise data management giant Iron Mountain, although we weren’t able to confirm this. “Google doesn’t publicly disclose this information,” a company spokesperson said in an email.

Documentation for the Transfer Appliance instructs users to grant permissions on their staging bucket in Google Cloud Storage to [email protected]. It is public knowledge that Iron Mountain is an official Google Cloud Platform partner, but the only publicly documented part of the partnership is a service for transferring customer data stored on tape in Iron Mountain facilities to Google’s cloud data centers.
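
For readers unfamiliar with that step, granting a service account access to a Cloud Storage staging bucket takes only a few lines with the google-cloud-storage client. The bucket name and account below are placeholders (not the redacted address from the documentation), and the role is an assumption about what the appliance workflow needs:

```python
from google.cloud import storage

BUCKET = "my-transfer-staging-bucket"  # placeholder bucket name
MEMBER = "serviceAccount:offline-ingest@example.iam.gserviceaccount.com"  # placeholder

client = storage.Client()
bucket = client.bucket(BUCKET)

# Fetch the bucket's IAM policy, append a binding, and write it back.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectAdmin",  # assumed role for staging uploads
    "members": {MEMBER},
})
bucket.set_iam_policy(policy)
```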

India’s first solar-powered train makes its debut

The content below is taken from the original (India’s first solar-powered train makes its debut), to continue reading please visit the site. Remember to respect the Author & Copyright.

India’s diesel-powered train network has a new kid on the block. The gas-guzzling Indian Railways system has just debuted its first solar-powered train, a diesel electric multiple unit (DEMU) fitted with roof-mounted solar panels. It will operate in the city of New Delhi.

Placement of the solar panels on the train car was challenging. Sandeep Gupta, Vice Chairman and Managing Director of Jakson Engineers Limited (the company that produced and installed the solar panels) told Business Standard, "It is not an easy task to fit solar panels on the roof of train coaches that run at a speed of 80 km per hour." The panels feed into an onboard battery that can store surplus power.

The train will still be pulled by a diesel locomotive; the solar panels will only power passenger comfort systems, such as lights, information displays and fans. Even so, Indian Railways estimates that just one train with six solar-panel-equipped cars will save 21,000 liters (5,547 gallons) of diesel fuel per year, at a cost savings of around Rs12 lakh (almost $20,000).

Indian Railways is the largest rail network in Asia, running around 11,000 trains daily. The service moves roughly 13 million passengers every day. That translates to incredibly large fuel bills; in 2015, the service spent Rs16,395 crore ($2.5 billion) on diesel. It has been trying to reduce its fuel consumption, in part by relying more on solar energy; the hope is that this will save it Rs41,000 crore ($6.31 billion) over the next 10 years.

Via: Quartz

Source: Business Standard

List of Microsoft Apps available for Android

The content below is taken from the original (List of Microsoft Apps available for Android), to continue reading please visit the site. Remember to respect the Author & Copyright.


When you think about Microsoft, the first thing that comes to mind is probably Windows and Office. But the company has been working hard to gain popularity on other platforms like Android and iOS as well. In this post, we’ve listed the applications developed by Microsoft for Android.

Microsoft Apps for Android


As of this writing, there are 96 apps that Microsoft has developed for the Android platform. Some of them are listed below:

  1. Microsoft Apps: A hub to view and download all other Android applications released by Microsoft.
  2. Arrow Launcher: A free launcher application that is productivity focused and comes with a beautiful design.
  3. Microsoft Word: This application lets you view, edit and create new Word documents. It has a beautiful interface and is the most widely used Android application by Microsoft.
  4. Microsoft Excel: An application to manage Excel spreadsheets on your phone.
  5. Microsoft PowerPoint: Create, Edit and View PPTs on the go with this application.
  6. Microsoft Outlook: Official Microsoft backed email client that works with all major email providers.
  7. Skype for Business: Extends the functionality of Lync and Skype for your business accounts.
  8. Microsoft Office Mobile: Office application for Android OS before version 4.4.
  9. Microsoft OneDrive: Android client to view, upload and modify your files from your OneDrive storage account.
  10. Cortana for Android (Early Access): This application brings Microsoft’s digital assistant Cortana to Android devices.
  11. Office Lens: A document scanning utility that links with your Microsoft Account.
  12. OneNote: Part of the office suite, a note taking application linked to your Microsoft account so that you can work anywhere.
  13. Intune Company Portal: Mobile client for installing company apps and remotely managing the device enrolled to Microsoft Intune.
  14. Microsoft Bing Search: The Bing search engine client for Android devices. Works with voice commands and quick actions.
  15. OWA for Android (Pre-Release): E-mail client for mailboxes on Office 365 for Business.
  16. Next Lock Screen: A lock screen application intended to save time and quickly take actions through the lock screen.
  17. Microsoft Translator: Translates text, voice, and images into more than 60 languages on your phone.
  18. Microsoft Hyperlapse Mobile: Create stunning time-lapse videos easily.
  19. Microsoft Authenticator: A helper Android application to facilitate two-step authentication.
  20. Microsoft Remote Desktop: Remotely access your Windows computers from anywhere.
  21. Kaizala: One of Microsoft’s Garage projects. It is a wonderful group and individual chat application that focuses on productivity.
  22. MSN Money: Get stock quotes, prices, market data and news from MSN.
  23. Xbox: Connect with your Xbox friends and see what they are playing. A complete application for the Xbox community.
  24. Age of Empires: Castle Siege: A fun strategy game where you can create your own kingdom and defeat your enemies.
  25. Lync 2010: Mobile client for Lync Server.
  26. Microsoft To-Do: A great cross-platform cloud based task manager linked to your Microsoft Account.
  27. Face Swap: A fun application that lets you swap your face with other people.
  28. Microsoft Teams: A chat centered productivity application for companies using Office 365.
  29. Microsoft Solitaire Collection: Bring the world’s #1 classic solitaire game back to your Android device.
  30. MSN Weather: A weather application to view daily weather forecasts and other conditions.
  31. SMS Organizer: A smart SMS inbox decluttering tool. Comes with features like SMS reminders and more.
  32. Wordament: A free online fun word game that can be played with friends and other people online.
  33. MSN News: Read breaking news and stories from around the world, provided by MSN.
  34. Keyboard for Excel: A Garage project that provides an awesome keyboard for entering numeric and other data into the Excel application.
  35. Microsoft Selfie: An intelligent selfie image enhancement tool available on all platforms.
  36. Picturesque Lock Screen: Another lock screen app that provides notifications, contacts and Bing images right on your lock screen.
  37. Xbox 360 SmartGlass: An Xbox 360 companion application that lets you control your console using your mobile.
  38. Switch to Windows: An application that lets you transfer contacts and other data if you are moving to Windows Phone.
  39. Microsoft SharePoint: Connect to your Office 365 SharePoint websites using this app.
  40. Office 365 Admin: An account manager of Office 365 administrator accounts.
  41. Intune Managed
  42. Microsoft Planner: A free task manager and teamwork organizer for Office 365 users.
  43. Office Remote: Turn your phone into a wireless remote that can control Microsoft Office on your computer.
  44. Microsoft Power BI: Monitor and access your business data anywhere and interact with your Power BI dashboards.
  45. Outlook Group: Collaborate with your team and do more with your Office 365 work or school account.
  46. Xbox Beta: The public beta release of the upcoming Xbox app with new functionality.
  47. Xbox One SmartGlass: This app allows the user to gain remote access to their Xbox One.
  48. Microsoft Azure: Manage your Azure account from anywhere. Monitor resources and resolve issues in no time.
  49. Dynamics 365: Provides professionals with the best tools for engaging customers and maintaining their data.
  50. Microsoft Remote Desktop Beta: Remotely manage your Windows Computers.
  51. My Apps: Connect to the apps that you already use at work or school.
  52. Microsoft Channel 9: Browse and watch various developer videos from the MSDN’s Channel 9 website.
  53. Microsoft Flow: Automate Tasks and design your own workflows to do more.
  54. Azure Information Protection: Lets you view protected files that other people have shared with you.
  55. Mimicker Alarm: Another Garage Project app, an alarm clock application that wakes you up by playing a small game.
  56. Office Delve for Office 365: Connect with your workmates and see what they are working on across Office 365.
  57. MSN Sport: Get sports news and scores, and watch sports-related videos provided by MSN.
  58. PowerApps: For business users, get your custom apps that were shared with you.
  59. Bing Ads: Monitor and edit your Bing ad campaigns directly from your phone.
  60. Dynamics CRM: Manage customer information and prepare well for your appointments.
  61. Snap Attacks: A fast paced word-building game that can be played online.
  62. Microsoft Classroom: For Office 365 Education users, this app helps you manage your assignments and notes easily.
  63. Microsoft Wi-Fi: A Wi-Fi companion application for people visiting Seattle Center.
  64. Sprightly: Scan documents, create professional catalogs and videos for your business.
  65. O365 Message Encryption Viewer: View and reply to encrypted OME messages from your phone.
  66. Microsoft IT Showcase: Read about the technology used inside Microsoft.
  67. Clip Layer: A garage project application that lets you select, copy and save snippets from any screen.
  68. Microsoft Startup Directory: Learn and connect with startups that Microsoft is currently working with.
  69. Citizen Next: An application that aims to make submitting problems to municipal corporations easier.
  70. Send: An email client that works like a messaging app.
  71. Microsoft Tech: Get updates and news about the latest Microsoft technology and events.
  72. News Pro: Get in-depth news curated to your interests.
  73. Microsoft StaffHub: Manage your staff and schedule shifts easily.
  74. Hub Keyboard: This keyboard brings information from different services to your fingertips.

The list is pretty big, and most of the applications are quite useful. Some of them are Garage projects, and some are still in beta. We have not covered a few of the less popular applications, so you may want to check the Play Store for the complete list.


Now you can get a bachelor’s degree in data center engineering

The content below is taken from the original (Now you can get a bachelor’s degree in data center engineering), to continue reading please visit the site. Remember to respect the Author & Copyright.

In an era where all the hot tech jobs seem to focus on application development and cloud computing, it can be hard to find fresh data center engineering talent. The Institute of Technology in Sligo, Ireland, is trying to rewrite that story with a new Bachelor’s Degree in Data Center Facilities Engineering, starting this fall.

According to the school, “The purpose of this new engineering degree programme is to provide the Data Centre industry with staff who are qualified to provide the proficient and in-depth skills necessary for the technical management and operation of data centre facilities. Expert operation and maintenance of these facilities is crucial in order to maintain 24/7 services with optimum energy efficiency.”

Online classes, but you still have to go to Belgium

While the school may be in Ireland, the degree is offered online, in English, with lab sessions in Haute École Louvain en Hainaut (HELHa) in Mons, Belgium.  Students will have to show up there on specified days, which obviously tilts the program toward European students. Still, it provides a template that could be followed by schools around the world, creating cadres of new data center staff members without the need for intensive on-the-job training. 

Because the course was reportedly developed after 18 months of consultations with Google, Facebook and Microsoft, it’s not entirely clear to what extent the course will focus on traditional enterprise data center practices or if it will concentrate on the needs of the web giants. 

Creating a pool of entry-level workers?

But in an industry where technology graduates typically receive a more general education, perhaps bolstered with a short specialized course in data centers or another discipline, the mere fact that this degree exists comes as a vote of confidence for careers in data center engineering. To be fair, there is a master’s program in data center engineering at SMU’s Lyle School of Engineering in Dallas, but that requires students to already have a B.S. in engineering or a related field. The IT Sligo course can be expected to turn out younger, presumably less expensive, data center workers who are better suited for entry-level positions.

IT Sligo President Dr. Brendan McCormack told Irish Tech News, “We are proud to be helping the data centres to address a specific skill need which several of the world’s leading tech companies recognise and value.”

Denis Browne, Google’s EU regional data center lead, added, “Google’s data centers are some of the best in the world, and we look for the best talent to work with us. Thanks to IT Sligo and HELHa, this online course will increase the skills of people already working in the sector and for those who wish to work in the industry going forward.”

In four years or so, we’ll have a better idea of how it’s all working out.


Data-collecting benches are making their way into cities

The content below is taken from the original (Data-collecting benches are making their way into cities), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Anastasia Tokmakova, Jul 17, ’17 4:47 PM EST

    Image courtesy of Soofa

    A pair of USB ports on a console on the front of the bench provides juice from the solar panel mounted at lap level between the seats. Who wouldn’t want to hang out at a bench like this? It certainly catches the eye of passersby. What these kids might not realize, however, is that this bench is watching them back.
    Landscape Architecture Magazine

    “Smart” benches are spreading. Recently a series of them, manufactured by Soofa, was installed in a tiny neighborhood park next to I-77 on the north end of Charlotte, North Carolina, to gather data for the neighborhood’s analysis and redevelopment.

    Soofa, founded in 2014 by three graduates of the MIT Media Lab, is one of a handful of companies designing data-collecting street furniture. Its solar-powered benches register Wi-Fi-enabled devices within 150 feet of them, sending data back to an office building in East Cambridge, Massachusetts. While the sensors can’t access personal information from your phone, they pick up and remember your device’s MAC address. The technology allows cities and urban planners to count users of various public spaces, identifying when and for how long they’re visited, and potentially optimizing their design.
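
    As a rough sketch of how such counts might be derived (this is illustrative, not Soofa's actual pipeline), the idea is to reduce raw Wi-Fi probe observations to per-device visit statistics, ideally keyed by a salted hash so no raw MAC addresses are retained:

    ```python
    import hashlib
    from collections import defaultdict

    # Each observation: (unix_timestamp, mac_address) seen by a bench's sensor.
    observations = [
        (1500300000, "aa:bb:cc:dd:ee:01"),
        (1500300120, "aa:bb:cc:dd:ee:01"),
        (1500300300, "aa:bb:cc:dd:ee:02"),
    ]

    def visit_stats(obs, salt=b"rotate-me-daily"):
        """Count unique devices and dwell time (seconds) per device, keyed by a
        salted hash so raw MAC addresses are never stored."""
        sightings = defaultdict(list)
        for ts, mac in obs:
            key = hashlib.sha256(salt + mac.encode()).hexdigest()[:16]
            sightings[key].append(ts)
        return {k: max(v) - min(v) for k, v in sightings.items()}

    stats = visit_stats(observations)
    print(len(stats), "devices seen; dwell times:", stats)
    ```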

    “The line between collecting data for a valid public purpose and the unreasonable surveillance of private citizens can be tough to tease out. Beyond clear dangers like hacking and data breaches, and underlying concerns about private corporations somehow benefiting from data collected on the taxpayer’s dime, are existential questions about privacy as a basic human right.”


     

  • The Acorn Archimedes At 30

    The content below is taken from the original (The Acorn Archimedes At 30), to continue reading please visit the site. Remember to respect the Author & Copyright.

    The trouble with being an incidental witness to the start of something that later becomes world-changing is that at the time you are rarely aware of what you are seeing. Take the Acorn Archimedes, the home computer for which the first ARM processor was developed, and which has just turned 30. If you were a British school pupil in 1987 who found a pair of the new machines alongside the row of BBC Micros in the school computer lab, for sure it was an exciting event, after all these were the machines everyone was talking about. But the possibility that their unique and innovative processor would go on to spawn a line of successors that would eventually power so much of the world three decades later was something that probably never occurred to spotty ’80s teens.

    [Computerphile] takes a look at some of the first Archimedes machines in the video below the break. We get a little of the history and a description of the OS, plus a look at an early model still in its box and one of the last of the Archimedes line. Familiar to owners of this era of hardware is the moment when a pile of floppies is leafed through to find one that still works, then we’re shown the defining game of the platform, [David Braben]’s Lander, which became the commercial Zarch, and provided the template for his Virus and Virus 2000 games.

    The Trojan Room Coffee Cam Archimedes, on display at the Cambridge University Computing Department.

    We see the RiscOS operating system booting lightning-fast from ROM and still giving a good account of itself 20 years later even on a vintage Philips composite monitor. If you were that kid in 1987, you were in for a shock when you reached university and sat down in front of the early Windows versions; it would be quite a few years before mainstream computers matched your first GUI.

    The Archimedes line and its successors continued to be available into the mid 1990s, but faded away along with Acorn through the decade. Even one being used to power the famous Trojan Room Coffee Cam couldn’t save it from extinction. We’re told they can still be found in the broadcast industry, and until fairly recently they powered much of the electronic signage on British railways, but other than that the original source of machines has gone. All is not lost though, because of course we all know about their ARM joint venture which continues to this day. If you would like to experience something close to an Archimedes you can do so with another computer from Cambridge, because RiscOS is available for the Raspberry Pi.

    Sit back and enjoy the video, and if you were one of those kids in 1987, be proud that you sampled a little piece of the future before everyone else did.

    VIDEO

    Thanks [AJCC] for the tip.

    Archimedes header image: mikkohoo, (CC BY-SA 4.0).

    Flat microscope for the brain could help restore lost eyesight

    The content below is taken from the original (Flat microscope for the brain could help restore lost eyesight), to continue reading please visit the site. Remember to respect the Author & Copyright.

    You’d probably prefer that doctors restore lost sight or hearing by directly repairing your eyes and ears, but Rice University is one step closer to the next best thing: transmitting info directly to your brain. It’s developing a flat microscope (the creatively titled FlatScope) that sits on your brain to both monitor and trigger neurons modified to be fluorescent when active. It should not only capture much more detail than existing brain probes (the team is hoping to see “a million” neurons) but also reach levels deep enough to shed light on how the mind processes sensory input. And that, in turn, opens the door to controlling sensory input.

    FlatScope is part of a broader DARPA initiative that aims to create a high-resolution neural interface. If technologies like the microscope lead to a way to quickly interpret neuron activity, it should be possible to craft sensors that send audiovisual data to the brain and effectively take over for any missing senses. Any breakthrough on that level is a long way off (at best) when even FlatScope exists as just a prototype, but there is some hope that blindness and deafness will eventually become things of the past.

    Source: Rice University

    Here’s what Atari’s upcoming Ataribox console will look like

    The content below is taken from the original (Here’s what Atari’s upcoming Ataribox console will look like), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Retro consoles are the new next-gen consoles, and nothing is more retro than Atari. That’s why the teases from the gaming company about its upcoming ‘Ataribox’ have been so intriguing to gaming fans – it could be amazing. Now we know what it looks like and, thanks to an email update (via The Verge), broadly what it will be able to do.

    The design is clearly an homage to the Atari of yore, but it’s also not a straight-up miniaturization like the NES Classic. Instead, it inherits some of the materials: there’s a woodgrain option and a black glass front, depending on your preference. There are also ports for an SD card, HDMI, and four USB ports, and the company will be offering classic games on the console, similar to the NES Classic’s library.




    But the Ataribox will also be able to run “current” games, so it could be more like a modern set-top gaming device, too. We don’t yet know much about what it will offer on that front, but it’d be interesting if this were essentially a Shield-like Android TV device with a host of retro Atari titles pre-loaded and some media-streaming capabilities.

    Nothing yet on final availability or pricing, but it’s still an intriguing project to keep an eye on – and one which could indicate the true depth of the retro gaming fad’s appeal.

    Enable Storage Sense in Windows 10 Creators Update

    The content below is taken from the original (Enable Storage Sense in Windows 10 Creators Update), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Ensuring that there is enough free disk space to install updates and keep Windows running smoothly can be a problem on devices with solid state storage. New in Windows 10 Creators Update, Storage Sense can automatically delete unnecessary files to maintain a healthy level of free disk space. In this Ask the Admin, I will show you how to turn it on.

     

     

    As solid-state disks (SSDs) become more common in all types of Windows 10 devices, there is often no reason to have massive amounts of storage on your device while working with cloud solutions like Office 365 and Google G Suite. A 256GB SSD should be plenty for most Windows 10 installations with a modest set of installed applications. There are always exceptions. If you edit video or need to keep tons of media available offline, 256GB might be stretching it. Most business users can make do with the minimum of local storage.

    In theory, 256GB should be enough storage most of the time, but it might still require either the user or an IT admin to perform periodic maintenance to ensure Windows has adequate free space to work properly and install updates. This is where Storage Sense can help.


    Windows Defender and Storage

    What do security and storage have to do with each other? In the Creators Update, Microsoft added a new user interface for Windows Defender that alerts you about device health and performance, including issues with storage capacity. If the device is low on disk space, critical updates might not install.

    New Windows Defender UI in Windows 10 Creators Update (Image Credit: Russell Smith)

    Storage Sense

    If you get a warning from Windows Defender about storage, the Open Settings button will take you to Storage in the Settings app. From there, you can enable Storage Sense, which can automatically delete temporary files that apps are not using and any files that have been in the Recycle Bin for more than 30 days. Additionally, you can click Clean now to force Storage Sense to run right away. You can also open the Storage page in the Settings app directly by typing storage into the search box in the bottom left of the taskbar and selecting Storage from the list of results.
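
    As a rough approximation of what Storage Sense does (an illustrative script, not Microsoft's implementation), the logic amounts to checking free space and pruning files older than a cutoff from temporary locations:

    ```python
    import os
    import shutil
    import time

    def prune_old_files(folder: str, max_age_days: int = 30, min_free_gb: int = 10) -> None:
        """Delete files older than max_age_days from folder, but only when free
        space on that volume has fallen below min_free_gb."""
        free_gb = shutil.disk_usage(folder).free / 2**30
        if free_gb >= min_free_gb:
            return
        cutoff = time.time() - max_age_days * 86400
        for root, _dirs, files in os.walk(folder):
            for name in files:
                path = os.path.join(root, name)
                try:
                    if os.path.getmtime(path) < cutoff:
                        os.remove(path)
                except OSError:
                    pass  # file in use or already gone

    # Example: point it at a scratch directory on Windows (not the real Recycle Bin).
    prune_old_files(os.path.expandvars(r"%TEMP%"), max_age_days=30)
    ```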

    Storage Sense in Windows 10 Creators Update (Image Credit: Russell Smith)


    There is no setting to enable Storage Sense using Group Policy at this time. And while this new storage setting is a welcome addition to Windows, we will have to wait until the Fall Creators Update to get On-Demand Sync in OneDrive and OneDrive for Business. This is a new and improved version of the OneDrive placeholder files that were available in Windows 8.1, allowing users to sync files as needed rather than having to sync entire folders from the cloud. In my experience, temporary files and the Recycle Bin certainly add to the problem, but files synced from OneDrive are just as likely to cause free-space issues.

    The post Enable Storage Sense in Windows 10 Creators Update appeared first on Petri.

    Here’s How Azure Stack Will Integrate into Your Data Center

    The content below is taken from the original (Here’s How Azure Stack Will Integrate into Your Data Center), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Azure Stack, the turnkey hybrid cloud system that you can now order from server vendors like Dell EMC and Hewlett Packard Enterprise or get as a managed service from providers like Avanade on hardware in your own data center, is intended to be concrete proof of Microsoft’s view that cloud is an operating model and not a place. It’s obviously designed to let you integrate private and public cloud services – but how well will it fit into your existing infrastructure?

    What it gives you is a system that’s not exactly the same as Azure running in an Azure data center but is consistent with it, using the same management API and portal, with many of the same services, giving you a unified development model. Think of it as a region in Azure. Not all Azure regions have exactly the same services available, but they all get the core services, ranging from storage, IaaS, and Azure Resource Manager to Key Vault, with Azure Container Service and Service Fabric coming to Azure Stack next year. Some public Azure services may never make it to Azure Stack, because some things only make sense at hyper-scale.
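
    To make the "same management API" point concrete, the sketch below (illustrative only, with a placeholder subscription, token, and Azure Stack endpoint) issues the identical Azure Resource Manager call against public Azure and an Azure Stack deployment, changing nothing but the base URL:

    ```python
    import requests

    SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"  # placeholder subscription ID
    TOKEN = "eyJ..."  # bearer token obtained from Azure AD or AD FS beforehand

    # Public Azure vs. a placeholder Azure Stack management endpoint.
    ENDPOINTS = [
        "https://management.azure.com",
        "https://management.local.azurestack.external",
    ]

    for endpoint in ENDPOINTS:
        # Identical ARM call; only the base URL differs between cloud and Stack.
        resp = requests.get(
            f"{endpoint}/subscriptions/{SUBSCRIPTION}/resourcegroups",
            params={"api-version": "2016-09-01"},
            headers={"Authorization": f"Bearer {TOKEN}"},
        )
        print(endpoint, resp.status_code)
    ```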

    Compliance, Performance, Data

    You can use Azure Stack to run cloud workloads that you don’t want in the public cloud for compliance reasons – the most common consideration when businesses weigh cloud services. That includes both the Azure services and third-party PaaS and IaaS workloads, such as Cloud Foundry, Kubernetes, Docker Swarm, Mesosphere DC/OS, and open source stacks like WordPress and LAMP, which come as services from the Azure Marketplace rather than bits you download, install, and configure manually. Just as interesting is the ability to use cloud tools and development patterns without the latency of an internet connection – whether you have poor connectivity (on oil rigs and cruise ships, in mines, and other challenging locations) or need to process sensor data in near-real-time.

    The hybrid option is going to be the most powerful. You can use Azure services like IoT Event Hubs and Cognitive Services APIs with serverless Functions and Azure Stack to build an AI-powered system that can recognize authorized workers and unauthorized visitors on your construction site and warn you when someone who’s not certified is trying to use dangerous machinery. Microsoft and Swedish industrial manufacturer Sandvik showed a prototype of that at the Build conference this year.

    That’s the kind of system you’d usually choose to build on a cloud platform, because setting up IoT data ingestion, data-lake, and machine learning systems you’d need before you could even start writing code would be a complex and challenging project. With Azure Stack, developers can write hybrid applications that integrate with services in the public Azure cloud that can be a first step in an eventual migration (if the issue is data residency and a cloud data center opens in the right geography), or to augment a system you never plan to put in the public cloud, and have the same DevOps process covering both environments.

    Image: Microsoft

    You can also use Azure Stack to run existing applications, especially if you want to start containerizing and modernizing them to move from monolithic apps to microservices. “You can connect to existing resources in your data center, such as SQL or other databases via the network gateway that is included in Azure Stack,” Natalia Mackevicius, director of Azure infrastructure management solutions, explained in an interview with Data Center Knowledge.

    But even if you’re using Azure Stack to virtualize existing applications, you’re going to be managing it in a very different way from your existing data center infrastructure – even if that includes Microsoft’s current Azure Pack way of offering cloud-style services on premises.

    Step Away from the Servers

    Azure Stack does integrate with your existing tools. When you set it up, you can choose whether to manage access using Azure Active Directory in a hybrid cloud situation, or Active Directory Federation Services if it’s not going to be connected to the public cloud.

    But you never do most of the setup you would with most servers. Network configuration happens automatically when you connect the switches in Azure Stack to your network, for example. “Customers complete a spreadsheet with relevant information for integration into their environment with information, such as the IP space to be used and DNS. When Azure Stack is deployed, the deployment automation utilizes this information to configure Azure Stack to connect into the customer’s network,” Mackevicius said.

    You won’t monitor Azure Stack like a normal server cluster because much of what an admin would normally do is automated and taken care of by the infrastructure system. But there are REST APIs for monitoring and diagnostics – as well as a System Center Operations Manager management pack for Azure Stack and a Nagios extension – so you can use your usual monitoring tools. Server vendors like HPE are using those APIs to integrate Azure Stack into their own tools, so if you already use HPE OneView, for example, you can manage Azure Stack compute, storage, and networking through that.

    “The switches in Azure Stack can be configured to send alerts and so on via SNMP, for example, to any central network monitoring tools,” Mackevicius said. “Each Azure Stack integrated system also has a Hardware lifecycle host (HLH), where the hardware partner runs software for hardware management, which may include tools for power management.”

    The portal on Azure Stack lets you manage the VMs that you’re running on it (and with the Windows Azure Pack Connector for Azure Stack, you can also manage VMs running on your existing infrastructure on Azure Pack), but not the IaaS service that runs them. “You can use monitoring tools such as System Center Operations Manager or Operations Management Suite to monitor IaaS VMs in Azure or Azure Stack in the same way you monitor VMs in your data centers.”

    Backup and DR

    For backup and DR, you need to think both about tenant workloads and the infrastructure for Azure Stack itself. Microsoft suggests Azure Backup and Azure Site Recovery for replication and failover, but that’s not the only option. “Tenant assets can use existing backup and DR tools such as Veeam, Commvault, Veritas Backup products,” or whatever other systems you already have in place.

    “For [its own] infrastructure, Azure Stack includes a capability which takes a periodic snap of the relevant data and places it on an externally configurable file share,” Mackevicius explained. That stores metadata like subscription and tenant-to-host mapping, so you can recover after a major failure, and you can use regions within your Stack deployment for scale and geo-redundancy.

    Updates on Your Own Schedule

    Updating is also very different. Updates to the Azure services and capabilities will come whenever they’re ready; updates for the Azure Stack infrastructure will come regularly, but that’s updates to infrastructure management. Even though Azure Stack runs on Windows Server, you’re not going to sit there testing and applying server patches. What Microsoft calls ‘pre-validated’ updates are delivered automatically, and what you control is when they’re applied, so they happen during your chosen maintenance window.

    Getting updates to be seamless and stress-free is why Microsoft turned to specific hardware partners rather than letting customers build DIY Azure Stack configurations. “Sure, you can get it up and running … but then you need everything to update, and by the way, that needs to happen while all the tenants continue to run,” explained Vijay Tewari of the Azure Stack team. “The thing people fixate on is getting the initial deployment right, but this is about the full operational lifecycle, which is a much bigger proposition.”

    That’s one of the reasons to bring cloud to your data center in the first place. “We have a highly simplified model of operation. We don’t want our customers spending inordinate amount of their resources, time, or money just trying to keep the infrastructure running. That’s not where the value of Azure comes from; it comes from innovative services, whether it’s Service Fabric, whether it is SQL DB, or Azure Machine Learning.”

    Azure Stack gives you the option of taking advantage of that cloud value without having to give up the value you get from your own data centers, but you will be doing things differently.

    Azure Adds D_v3 and E_v3 Virtual Machine Series

    The content below is taken from the original (Azure Adds D_v3 and E_v3 Virtual Machine Series), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Microsoft launched 2 new series of virtual machines in July, the D_v3 and the E_v3, which are successors to the D_v2-Series. There are some interesting new firsts with these series. In this article, I will discuss the features of these new series and how this impacts the promotional pricing of the D_v2-Series virtual machines.

    Successor Series

    This is not the first time that Microsoft has launched a successor series of virtual machines. In the past, the Standard A_v2-Series replaced the Standard A-Series with a more recognizable set of sizes and lower costs. We have seen the D_v2-Series come in with newer hardware and (eventually) lower costs than the original D-Series.

     

     

    Recently, Microsoft has started to split the categorization of the D_v2-Series machines into two groupings:

    • General purpose: A normal balance of CPU to RAM, including the D1_v2 to D5_v2 machines
    • Memory optimized: Higher than normal RAM, including the D11_v2 to D15_v2 machines

    This split is a little confusing. Instead of continuing this split of the D-Series, Microsoft has decided to replace the D_v2 machines in the memory optimized category with a new E_v3 series. The general purpose D_v2 virtual machines are replaced by the new D_v3 virtual machines.

    Host Changes

    The D_v2-Series machines were based on the 2.4GHz Intel Xeon E5-2673 v3 (Haswell) processor, which is capable of bursting up to 3.1GHz with Intel Turbo Boost Technology 2.0. The D_v3 and E_v3-Series machines are based on the newer 2.3GHz Intel Xeon E5-2673 v4 (Broadwell) processor, which can reach up to 3.5GHz, also with Intel Turbo Boost Technology 2.0, for an extra 0.4GHz of peak clock speed.


    Since last year, Microsoft’s new series have departed from the old multiples of 1.75GB of RAM and moved to using recognizable quantities of vCPUs and RAM. The name of the virtual machine size indicates the number of virtual processors, and the amount of RAM is a multiple of the vCPU count, as the examples and the short sketch below show:

    • The D2_v3 has 2 virtual processors and 8GB (x4) RAM.
    • The D4_v3 has 4 virtual processors and 16GB (x4) RAM.
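
    A trivial way to express that naming convention (an illustration of the pattern above, not an official sizing API):

    ```python
    def d_v3_size(name: str) -> tuple:
        """Derive (vCPUs, RAM in GB) from a D_v3 size name such as 'D4_v3',
        using the 4GB-per-vCPU ratio described above."""
        vcpus = int(name[1:].split("_")[0])
        return vcpus, vcpus * 4

    print(d_v3_size("D2_v3"))  # (2, 8)
    print(d_v3_size("D4_v3"))  # (4, 16)
    ```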

    The D_v3 and E_v3 also introduce the use of Intel Hyper-Threading; past machines did not use it. As a result, you should notice a significant price reduction (up to 28 percent, according to Microsoft) between the regular price of the D_v2-Series and the D_v3-Series (West US 2 region):

    • D2_v2 with 2 cores and 7GB RAM costs $108.63 per month.
    • D2_v3 with 2 virtual CPUs and 8GB RAM costs $74.40 per month.

    Nested Virtualization

    Another significant change is that the v3 virtual machines and the M-Series run on hosts that are powered by Windows Server 2016 (WS2016) Hyper-V. That does not mean all that much by itself, but it does mean that some WS2016 features might start to appear over time. The first of these new features is support for nested virtualization: you can run Hyper-V virtual machines inside of Azure (Hyper-V) virtual machines.

    I would not start by saying, “Hey let’s run production Hyper-V clusters on Azure,” but there will be some interesting scenarios:

    • Those of us that need accessible pay-as-you-go demo and training labs without expensive hardware have a new option.
    • Azure virtual machines can host Hyper-V containers. This is probably the core scenario that Microsoft was focusing on.

    Availability

    My guess is that new hardware is being deployed to support these two new series of virtual machines. This means that availability is limited to a few regions but this will grow over time:

    • West US 2
    • East US 2
    • West Europe
    • Southeast Asia

    D_v2 Promotional Pricing

    Microsoft offered promotional pricing for the D_v2-Series ahead of the launch of the D_v3- and E_v3-Series machines, probably to seed adoption; the older series was priced to roughly match the two new ones.

    Microsoft will be winding down this promotional offer in regions where the D_v3 and E_v3 are now available. Customers will be able to deploy the D_v2 promotional machines until August 15th in the above regions. In the remaining regions without D_v3 and E_v3 availability, the promotional offer will continue until the new series are launched.


    All currently deployed D_v2_promo virtual machines (a specific set of SKUs) will continue to be billed using the promotional pricing until June 30, 2018. At that time, the machines will revert to D_v2 pricing, so you should either stick with D_v2 machines or upgrade them to D_v3 or E_v3 machines.

    The post Azure Adds D_v3 and E_v3 Virtual Machine Series appeared first on Petri.

    Google Cloud Platform now open in London

    The content below is taken from the original (Google Cloud Platform now open in London), to continue reading please visit the site. Remember to respect the Author & Copyright.

    By Dave Stiver, Product Manager, Google Cloud Platform

    Starting today, Google Cloud Platform (GCP) customers can use the new region in London (europe-west2) to run applications and store data in London. London is our tenth region and joins our existing European region in Belgium. Future European regions include Frankfurt, the Netherlands and Finland.

    Incredible user experiences hinge on performant infrastructure. GCP customers throughout the British Isles and Western Europe will see significant reductions in latency when they run their workloads in the London region. In cities like London, Dublin, Edinburgh and Amsterdam, our performance testing shows 40%-82% reductions in round-trip time latency when serving customers from London compared with the Belgium region.

    We’ve launched London with three zones and the following services:

    The London region puts control over how to deploy resources directly in the hands of GCP customers, giving them a choice, for some GCP services, of where to run their applications and store their data. When a customer signs up for GCP services, they have three different options, depending on the service (see the sketch after the list below):

    1. Regional: Run applications and store data in a specific region, e.g., London, Tokyo, Iowa, etc.
    2. Multi-regional: Distribute applications and storage across two or more cloud regions on a given continent, e.g., Americas, Asia or Europe.
    3. Global: Distribute applications and store data globally across our entire global network for optimal performance and redundancy.
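
    For example, choosing the regional option for Cloud Storage amounts to pinning a bucket's location at creation time; here is a minimal sketch using the google-cloud-storage client, with placeholder bucket names:

    ```python
    from google.cloud import storage

    client = storage.Client()

    # Regional bucket pinned to the new London region (europe-west2).
    regional = client.create_bucket("example-regional-bucket", location="europe-west2")

    # Multi-regional bucket distributed across European locations.
    multi = client.create_bucket("example-multiregional-bucket", location="EU")

    print(regional.location, multi.location)
    ```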

    In addition, we’ve worked diligently over the last decade to help customers directly address EU data protection requirements. Most recently, Google announced a commitment to GDPR compliance across GCP. The General Data Protection Regulation (GDPR), which takes effect on May 25, 2018, is the most significant piece of European privacy legislation in the last 20 years.

    "Google’s decision to choose London for its latest Google Cloud Region is another vote of confidence in our world-leading digital economy and proof Britain is open for business. It’s great, but not surprising, to hear they’ve picked the UK because of the huge demand for this type of service from the nation’s firms. Earlier this week the Digital Evolution Index named us among the most innovative digital countries in the world and there has been a record £5.6bn investment in tech in London in the past six months.

    Karen Bradley, Secretary of State for Digital, Culture, Media and Sport

    "At WP Engine, we look forward to extending our digital experience platform to an even broader set of our 10,000 European customers who want to be hosted on Google Cloud Platform based in the London region. We are excited about bringing reduced latency benefits from the ability to store and process data in London to our UK customers."  

     Jason Cohen, Founder and CTO

    "The Telegraph benefits greatly from Google Cloud’s global scale and is pleased to see continued investment from Google Cloud in the UK. We look forward to working with them closely as they expand their business in the UK and Europe."  

     Toby Wright, CTO, The Telegraph

    "Google Cloud enables Revolut to try new ideas and stay agile while providing secure, reliable services for our customers at scale."  

    Vladyslav Yatsenko, Co-founder & CTO, Revolut

    For the latest on the terms of availability for services from this new region as well as additional regions and services, visit our London region page or locations page. For guidance on how to build and create highly available applications, take a look at our zones and regions page. Give us a shout to request early access to new regions and help us prioritize what we build next.

    We’re excited to see what you’ll build on top of the new London region!

    HPE ProLiant Gen10 Featuring Intel Xeon Scalable Processors Launched

    The content below is taken from the original (HPE ProLiant Gen10 Featuring Intel Xeon Scalable Processors Launched), to continue reading please visit the site. Remember to respect the Author & Copyright.

    HPE DL380 Gen10 Stack

    HPE is synonymous with data center computing. The HPE ProLiant Gen10 was shown off at Discover, but now, with the launch of the Intel Xeon Scalable Processor Family, the company can start marketing the servers to the general public. HPE is one of the top server vendors by volume, so whenever it unveils a new generation of servers, it is time to take notice.

    HPE ProLiant DL380 Gen10 Example

    Like we have seen with Dell EMC and other vendors, HPE seems to be introducing updates to its Gen9 line in waves. One of the exciting platforms we are seeing in the first wave of launches is the HPE ProLiant DL380 Gen10, replete with a gorgeous new faceplate.

    HPE DL380 Gen10 Front

    The HPE ProLiant DL380 is designed to be highly configurable for a user’s workload. There are different front storage options, multiple networking, and storage controller flavors available.

    HPE DL380 Gen10 Interior With Skylake SP

    As you can see, the HPE ProLiant DL380 has many customization points and can be used with the range of Intel Xeon Scalable Processor Family CPUs. The HPE ProLiant DL380 Gen10 can handle up to 192GB of persistent memory. HPE is making a major push toward NVDIMMs with this generation.

    Other key HPE ProLiant Gen10 launch-week servers are the HPE ProLiant DL360 Gen10, a 1U dual-socket server, and the HPE ProLiant DL560 Gen10. The DL560 Gen10 is a 2U, quad-socket server, which means one has access to performance like we saw in our quad Intel Xeon Platinum 8176 initial benchmarks in only 2U. For comparison, our test system from the Intel OEM business was 4U and the Dell PowerEdge R940 is 3U, giving HPE a 50-100% density advantage.

    More HPE Features for Gen10

    We wanted to highlight a few new features in Gen10. One example is iLO 5, the next generation of HPE’s management interface. We recently reviewed an HPE DL60 Gen9 with iLO 4.

    VIDEO

    Beyond the iLO management interfaces, HPE’s other tools are upgraded to manage the new Gen10 servers.

    One area that HPE, and other vendors, have been pushing in this product cycle is security. For example, the company is offering a Silicon Root of Trust firmware security protection option. If you have the iLO Advanced Premium Security Edition, you can have server firmware checked every 24 hours to ensure that it has not been tampered with. If it has been, you can roll back to the last known good state or to factory settings after compromised code is detected.

    Final Words

    While the HPE ProLiant product pages have three new Gen10 models and the rest are listed as Gen9, we fully expect this to change over the next few months. Like other major vendors, HPE is releasing its products in phases with this generation. They are certainly great looking machines.

    Microsoft Defines Its Path Forward for On-Premises

    The content below is taken from the original (Microsoft Defines Its Path Forward for On-Premises), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Every time I present or speak at an event, one question that always comes up is “what is the future for on-premises data centers?” It’s a valid question, as Microsoft is pushing everything cloud these days. While it may look like the company is moving beyond supporting local data centers, that’s not accurate; it is, however, going to force the modernization of the environment.

    Let’s be clear: the on-premises data center is not going away anytime soon. While Microsoft, Amazon, and Google would like to think that cloud is the be-all, end-all solution for IT infrastructure, HPE, Dell, and Lenovo can prove, based on their hardware sales, that this isn’t the case.

    That being said, Azure, AWS, and similar services will reduce the number of new data centers built, because as new companies are born, it typically makes more sense to build in the cloud than to invest in server hardware. Granted, that doesn’t work for every company, and there are genuine reasons why the cloud is not for everyone, but for most, cloud solutions are viable options going forward.

    There are two things you need to understand about Microsoft’s support for on-premises data centers: the new servicing model for Windows Server, and Azure Stack. These two items represent a genuine look at how Microsoft views the evolution of your data center.

    There are two primary servicing channels for Windows Server going forward: the Long Term Servicing Channel (LTSC) and the Semi-Annual Channel (SAC). As both names suggest, the long-term channel has a new release every 2-3 years, with 5 years of mainstream support, 5 years of extended support, and an option for 6 years of Premium Assurance assistance. The Semi-Annual Channel ships updates twice a year that deliver new features to the platform; the slide below provides a good look at the delivery model.

    The twice-a-year update cadence represents the nimble data center, one that can update frequently to receive the newest features first, while the LTSC is for older installations or companies that don’t have the capacity to regularly update their infrastructure.

    For those on the SAC, you will be able to skip one feature update per year, which will ease the burden on system admins who don’t want to devote time twice a year to deploying upgrades. Each feature update will be supported for 18 months, the same as Windows 10.

    To gain access to the Semi-Annual Channel, you will need to subscribe to Software Assurance or be using Azure. The LTSC will be available through all channels and is what many of us know as the traditional Windows Server model (for example, Windows Server 2016 falls into this branch).

    With this change, Microsoft will be updating its naming convention to one that is similar to how Windows 10 operates. SAC versions will be known as Windows Server 1709, 1803, and so on, while LTSC releases will stick to names like Windows Server 2012 and 2016. The goal here is simple: faster updates for its server software, in line with the practices used inside the modern data center, as the sketch below illustrates.
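
    Putting the SAC naming convention and the 18-month support window together gives a rough lifecycle calculator (an illustration; actual end-of-support dates are published by Microsoft and may differ slightly):

    ```python
    from datetime import date

    def sac_support_end(release: str, months: int = 18) -> date:
        """Approximate end of support for a Semi-Annual Channel release named
        YYMM (for example, '1709' means September 2017), 18 months after release."""
        year, month = 2000 + int(release[:2]), int(release[2:])
        total = year * 12 + (month - 1) + months
        return date(total // 12, total % 12 + 1, 1)

    print(sac_support_end("1709"))  # approximately 2019-03
    print(sac_support_end("1803"))  # approximately 2019-09
    ```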

    On the other side of the coin is the hybrid data center; Azure Stack is Microsoft’s big long-term investment for medium to large environments. The goal is to extend the benefits of Azure to on-premises operations, reaching those who may not typically have used the cloud platform but need the benefits of its technology.

    But more importantly, deploying Azure Stack in your data center sets up a future transition to the cloud-based solution. With Azure Stack-certified hardware now being offered, when an on-premises deployment looks to move to the cloud, you can easily port the local solutions with minimal impact, as Azure Stack is an extension of Azure down to your data center.

    The goal for Microsoft is to bring Azure to everyone and while not all operations can or are ready to move to the cloud, Microsoft is going to keep adding features to Stack and Server that tie in natively to Azure so that when these operations do move to the cloud, jumping to Azure is the natural choice.

    For other services like SharePoint, Microsoft has already said that there will be another release designed for on-premises operations. But, they have said that cloud-based users will get new features first and that’s the overall theme for Microsoft.

    If you want to be on the bleeding edge of Microsoft software and services, the cloud is the way to go, or at a minimum, Azure Stack. For those who need to work on-premises, you will find yourself behind your cloud-based peers, but the company is not abandoning these users anytime soon. That being said, the company is planting seeds to entice users to move to Azure and ditch local hardware, and those efforts will only intensify with each new iteration of on-premises software it releases.

    The post Microsoft Defines Its Path Forward for On-Premises appeared first on Petri.

    Any Alexa device can control your Fire TV

    The content below is taken from the original (Any Alexa device can control your Fire TV), to continue reading please visit the site. Remember to respect the Author & Copyright.

    You’d think that Amazon would have made it possible to control a Fire TV from external Alexa devices as soon as it was an option, but no — you’ve had to use the Fire TV itself if you wanted to play a video using your voice. At last, though, sense has prevailed. Amazon has updated all versions of the Fire TV and Fire TV Stick to add support for voice control from another Alexa-enabled device. If you want to skip to the next episode of a show, you can talk to your Echo or smartphone instead of scrounging for the Fire TV’s remote.

    Does the feature sound familiar? You’re not alone. One of the centerpieces of Google Home is its ability to queue up video on a Cast-enabled TV, and Amazon is effectively matching that feature note for note. Not that we’re complaining. This is arguably one of the biggest omissions in the Fire TV’s feature set, and it only makes sense if you live in a household with more than one Amazon device at your beck and call.

    Via: Android Police

    Source: Amazon

    AI lawyer can help you with a thousand different legal issues

    The content below is taken from the original (AI lawyer can help you with a thousand different legal issues), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Over two years ago, Joshua Browder, now a junior at Stanford University, created a chatbot that could contest parking tickets in New York City and London. By June of 2016, DoNotPay had successfully contested 160,000 parking tickets — a 64 percent success rate — and earlier this year, Browder added capabilities to assist asylum seekers in the US, UK and Canada. Now, the bot is able to assist with over 1,000 different legal issues in all 50 states and across the UK.

    To use DoNotPay’s AI-assisted help, you just type your problem into its search bar, and links to relevant aid specific to your location pop up. After you navigate through the different options, a chatbot asks you questions and puts together a letter or other legal documentation. The bots can help you write letters or fill out forms for issues like maternity leave requests, landlord disputes, insurance claims and harassment.

    Browder hasn’t accepted any outside funding yet, but monetization of DoNotPay is in its future. While he hasn’t decided how that will work, Browder is considering bot sponsorships, such as a car dealership sponsoring a parking ticket bot specific to its city.

    The "world’s first robot lawyer," as Browder refers to his service, has beaten an estimated 375,000 parking tickets and saved around $9.3 million in fines. If that success can translate to the 1,000 new legal areas the bot is taking on, DoNotPay can become a seriously useful free legal aid.

    Via: The Verge

    Source: DoNotPay (1), (2)

    oBike arrives in London with its dockless take on Boris bikes

    The content below is taken from the original (oBike arrives in London with its dockless take on Boris bikes), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Already this year we’ve seen two Chinese companies that run novel bike rental schemes expand into the UK, and now Singaporean firm oBike is throwing its chips into the pot, too. The startup has this week put 400 of its two-wheelers to work in the London Borough of Tower Hamlets, despite the capital being home to over 11,000 for-hire ‘Boris bikes.’ Unlike these, though, oBikes don’t require docking. Through the company’s mobile app, you locate the nearest available pushbike on a map, unlock it by scanning its unique QR code, then leave it wherever you want when you’re done.

    The app also handles the payment side of things — 50p per half hour — and includes a credit system that gives users free ride time for reporting damaged and illegally parked bikes. The company tells Wired that Tower Hamlets was a good place to start because it believes the area is underserved, and that it’s going to add hundreds of bikes every day throughout July to grow the scheme. According to oBike, it’s in ongoing discussions with local councils, which probably means it’s trying to persuade them there’s space for another player, considering the large number of ‘Boris bikes’ already in circulation.

    Even in the much less saturated city of Cambridge, Chinese company Ofo had to scale back its trial, which began in April, to only a handful of bikes after the council became concerned they would clutter pavements. Mobike, which runs another identical app-based rental scheme, has had problems of its own after putting 1,000 bikes on the streets of Manchester last month. It’s far from a widespread issue, but some bikes have been the victims of run-of-the-mill vandalism, while others have simply been stolen after having their locks hacked off.

    Via: Wired

    Source: oBike

    Bitdefender Home Scanner scans your Home Network for vulnerabilities

    The content below is taken from the original (Bitdefender Home Scanner scans your Home Network for vulnerabilities), to continue reading please visit the site. Remember to respect the Author & Copyright.

    The internet is now not just limited to mobile phones and computers. With all these smart devices showing up, your home network has grown drastically. We have mobiles, computers, smart TVs, surveillance devices, automation devices and a lot of other devices connected to the same network. But have you ever given a thought to the vulnerabilities of each of them? Securing work networks has always been a requirement, but securing home networks is becoming a priority these days too. To help you out with this, we have Bitdefender Home Scanner.

    Bitdefender Home Scanner is a free and fast Wi-Fi scanner for your home network. It looks for vulnerable devices and passwords, and offers detailed security recommendations for your home network.

    Bitdefender Home Scanner

    Pretty much as its name suggests, Bitdefender Home Scanner scans your home network for all kinds of vulnerabilities. Designed by network and security experts, this tool can uncover potentially harmful security flaws and weaknesses in your network. Moreover, it provides a detailed description of each flaw and the steps to fix it, making your network secure and safe.

    So, what does it do?

    The first step is knowing your network, so the program will ask you whether the Wi-Fi you are connected to is your home network; using this tool on public networks is not recommended at all. Once you’ve set up your home network, the program starts scanning. Bitdefender Home Scanner first discovers all the connected devices and then proceeds to scan them individually for vulnerabilities.

    In the process, the tool will scan all open ports and also look for poorly encrypted connections. All in all, Home Scanner checks for insecure connections, weak credentials and any hidden backdoors in your network.
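
    To give a rough idea of what a port scan involves (this is only a conceptual sketch, not Bitdefender’s code), a scanner simply tries to open a TCP connection to a set of ports on each device and records which ones answer. The port list and the router address below are illustrative assumptions.

    # Conceptual port-scan sketch in Python; not Bitdefender's implementation.
    import socket

    COMMON_PORTS = [21, 22, 23, 80, 443, 445, 8080]  # illustrative selection of common ports

    def scan_host(ip, ports=COMMON_PORTS, timeout=0.5):
        # Return the ports on `ip` that accept a TCP connection.
        open_ports = []
        for port in ports:
            sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            sock.settimeout(timeout)
            try:
                if sock.connect_ex((ip, port)) == 0:  # 0 means the connection succeeded
                    open_ports.append(port)
            finally:
                sock.close()
        return open_ports

    print(scan_host('192.168.1.1'))  # e.g. the router on a typical home network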

    Using this tool is pretty simple and straightforward. Just download, install and run it. You will need to create a Bitdefender account before using the tool, or sign in if you already have one. Once registered, you can select your home Wi-Fi network and proceed to scan. Scanning might take a little time depending on the number of devices connected to your network.

    Bitdefender Home Scanner

    Once the scan has been completed, a list of connected devices with their issues will be displayed. Devices that are clean and safe are marked with a green flag, while vulnerable devices are marked with a red flag, and you can view the issues in detail by clicking on a single device.

    Other simple details like MAC Address, IP Address, device manufacturer and device type can also be viewed from the same window.

    Another great feature of this tool is that once you’ve configured your home network, you will be notified whenever a new device connects, so that you can scan that device right away. It is also recommended to scan your devices frequently, as new vulnerabilities are discovered every day and you want to be sure your devices aren’t affected.

    You can also change your designated home network by going to the ‘My Account’ page, where a list of networks lets you choose which one the tool should treat as home and scan.

    Bitdefender Home Scanner is a great tool to have in addition to your antivirus/antimalware program. The tool makes sure you are always protected and none of your connected devices are vulnerable. It is a perfect tool if you use a lot of smart devices like TVs, surveillance systems, and automation devices. Click here to download Bitdefender Home Scanner.

    Read next: Bitdefender BOX will protect IoT Devices from Malware and Hacking.

    OpenStack Developer Mailing List Digest July 1-8

    The content below is taken from the original (OpenStack Developer Mailing List Digest July 1-8), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Important Dates

    • July 14, 2017 23:59 OpenStack Summit Sydney Call for Presentations closes 1.
    • Around R-3 and R-4 (July 31 – August 11, 2017) PTL elections 2
    • All 3

    Summaries

    • TC status update by Thierry 4
    • API Working Group new 5
    • Nova placement/resource providers update 6

    SuccessBot Says

    • pabelanger on openstack-infra 7: opensuse-422-infracloud-chocolate-8977043 launched by nodepool
    • clark on openstack-infra 8: infra added citycloud to the pool of test nodes.
    • fungi on openstack-infra 9: OpenStack general mailing list archives from Launchpad (July 2010 to July 2013) have been imported into the current general archive on lists.openstack.org.
    • adreaf on openstack-qa 10: Tempest ssh validation running by default in the gate on master.
    • All 11

    Most Supported Goals And Improving Goal Completion

    • Community wide goals discussions started at the OpenStack Forum, then the mailing list and IRC for those that couldn’t be at the Forum.
      • These discussions help the TC make decisions on which goals will be selected for a release.
    • Potential goals:
      • Split Tempest plugins into separate repos/projects 12
      • Move policy and docs into code 13
    • Goals in Pike haven’t really been reached.
    • An idea from the meeting to address this is creating a role called “Champions”: drum beaters who get a goal done by helping projects track status and sometimes doing code patches.
    • Interested volunteers should have a good understanding of their selected goal and its implementation, and be a trusted person.
    • From the discussion in the thread, it seems we’re mostly in agreement with the Champion idea.
      • We have a volunteer for splitting out tempest plugins into repos/projects.
    • Full thread 14

     

    1. http://bit.ly/2ugoHjo/
    2. http://bit.ly/2uPd2Vxl
    3. http://bit.ly/1xKkuAH/
    4. http://bit.ly/2uPd3c38
    5. http://bit.ly/2ugpErZl
    6. http://bit.ly/2uP7Tgq8
    7. http://bit.ly/2scetfCl
    8. http://bit.ly/2scetfCl
    9. http://bit.ly/2rVtVwZl
    10. http://bit.ly/2rVtVwZl
    11. http://bit.ly/2rCKikRs
    12. http://bit.ly/2uPd3c38
    13. http://bit.ly/2uP7xq2l
    14. http://bit.ly/2ugkn3H8

    #openstack #openstack-dev-digest

    Gobs of free Microsoft eBooks.

    The content below is taken from the original (Gobs of free Microsoft eBooks.), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Here ya go!

    http://bit.ly/2ufNRyy

    New infographic: Cloud computing in 2017

    The content below is taken from the original (New infographic: Cloud computing in 2017), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Cloud Computing Market - Infographic

    With 83% of businesses ranking cloud skills as critical for digital transformation in 2017, it’s great news for anyone with cloud architecting experience, and for those considering a career in cloud computing. In our new infographic, we compiled some of the latest industry research to look at the world of cloud computing in 2017.

    The cloud will continue to disrupt traditional IT models as the growing amount of data generated by people, machines, and things will increasingly be handled in the cloud. This is highlighted in both the shift in IT spending away from traditional on-premises hardware and the increased adoption of public, private, and hybrid cloud models.

    As cloud adoption increases, companies are using it to achieve greater scalability, higher performance, and faster time to market. As a result, skills for architecting, deploying, and securing the cloud will continue to be essential. And because companies are embracing multiple cloud models and multiple providers, those working in the cloud will need versatile skills that cover different platforms and services to help companies leverage the cloud for continued benefits and competitive advantage.

    Learn more about the world of cloud computing in 2017 in our infographic.
    Cloud Academy Infographic Web

    References: Microsoft Cloud Skills Report | Cloud Computing Top Markets Report | U.S. Talent Shortage Survey | Intelligentcio.com | Forbes.com | Rightscale.com | Tech.web | Computerweekly.com

    Azure Site Recovery now supports large disk sizes in Azure

    The content below is taken from the original (Azure Site Recovery now supports large disk sizes in Azure), to continue reading please visit the site. Remember to respect the Author & Copyright.

    Following the recent general availability of large disk sizes in Azure, we are excited to announce that Azure Site Recovery (ASR) now supports the disaster recovery and migration of on-premises virtual machines and physical servers with disk sizes of up to 4095 GB to Azure.

    Many on-premises virtual machines that are part of the Database tier, as well as file servers, use disks larger than 1 TB. Support for protecting these virtual machines with large disks has consistently been a top ask from both our customers and partners. With this enhancement, ASR now gives you the ability to recover or migrate these workloads to Azure.

    These large disk sizes are available on both standard and premium storage. In standard storage, two new disk sizes, S40 (2TB) and S50 (4TB), are available for managed and unmanaged disks. For workloads that consistently require high IOPS and throughput, two new disk sizes, P40 (2TB) and P50 (4TB), are available in premium storage, again for both managed and unmanaged disks. Depending upon your application requirements, you can choose to replicate your virtual machines to standard or premium storage with ASR. More details on the configuration, region availability and pricing of large disks are available in this storage documentation.

    To show you how Azure Site Recovery supports large disk sizes, I protected the Database tier VM of a SharePoint farm. You can see that this VM has data disks which are greater than 1 TB.

    Large disks

    Pre-requisite step for existing ASR users:

    Before you start protecting virtual machines/physical servers with greater than 1 TB disks, you need to install the latest update on your existing on-premises ASR infrastructure. This is a mandatory step for existing ASR users.

    For VMware environments/physical servers, install the latest update on the Configuration server, additional process servers, additional master target servers and agents.

    For Hyper-V environments managed by System Center VMM, install the latest Microsoft Azure Site Recovery Provider update on the on-premises VMM server.

    For Hyper-V environments not managed by System Center VMM, install the latest Microsoft Azure Site Recovery Provider on each node of the Hyper-V servers that are registered with Azure Site Recovery.

    I would like to call out that support for Disaster Recovery of IaaS machines in Azure with large disk sizes is not currently available. This support will be made available soon.

    Start using Azure Site Recovery today. Visit the Azure Site Recovery forum on MSDN for additional information and to engage with other customers. You can also use the ASR User Voice to let us know what features you want us to enable next.

    $99 buys you a useful, but plain, Android Wear watch

    The content below is taken from the original ($99 buys you a useful, but plain, Android Wear watch), to continue reading please visit the site. Remember to respect the Author & Copyright.

    When you think about all the Android Wear watches on the market, you probably recall LG, Huawei, Michael Kors or Tag Heuer. Google typically partners with heavyweights in tech and fashion. So it’s intriguing to see a small, obscure startup like Mobvoi offer its own Android Wear watch. What’s most interesting, though, is the Ticwatch E’s price tag: just $99.

    Of course that’s only if you buy the Ticwatch E on the company’s Kickstarter project before it becomes more widely available. When that happens, it will cost $159 and you’ll be able to get it on Amazon or Mobvoi’s website. A higher-end version called Ticwatch S will cost $119 on Kickstarter, and $199 at retail. Even after the early bird period, though, that’s still the cheapest Android Wear 2.0 watch around right now.

    What you get for those prices is surprising; Mobvoi didn’t cut corners. Both models sport a bright round 1.4-inch screen with a 400×400 resolution, a heart rate monitor and a GPS sensor — features that some more-expensive devices lack. Our demo unit of the Ticwatch E was responsive, and Google Assistant was actually faster than on competing devices I’ve tested. That’s particularly impressive given Mobvoi uses a Mediatek processor here instead of a higher-end Qualcomm option. During our preview, Google Maps was also quick to locate us, despite being indoors.

    With their bright colors (white, black and lime are available) and silicone rubber straps, the Ticwatch S and E both look and feel cheaper than the competition. While you can swap out the standard 22mm band on the S version to make it prettier, you’re stuck with the default non-removable strap on the E flavor. That’s because the latter’s GPS antenna is built into the band. Mobvoi figures the E model is more appropriate for a sportier crowd, so it made the entire device lighter. It also designed the E’s strap to be "breathable" by carving out a hollow underneath to avoid sweat buildup.

    Mobvoi said it included only "essential hardware" that it believes its users would need, which explains why neither version has a cellular radio. That omission not only keeps the watches slim and lowers costs, but should also allow for longer-lasting batteries than the competition. Depending on how you use it, the company says each device should last between 1.5 and 2 days.

    Mobvoi also says it will include five of its own apps on the watches, such as Tic Fitness, Health and Music Player, which lets you store and play music on your watch. These are carried over from the core app suite on the Ticwatch 2, and can’t be uninstalled. The inclusion is meant to please fans of the company’s existing smartwatches, which run Mobvoi’s own OS. Up to 15 other apps from that system will be available for download from the store, too. Since the apps weren’t on our demo unit, we couldn’t tell if they would actually be useful or feel more like bloatware.

    Ultimately, we can’t determine whether the Ticwatch S and E will hold up after long-term use based on our brief preview. People who are style-conscious probably won’t appreciate the watches’ distinctly plastic, toy-like appearance. But those who couldn’t care less about looks and are more interested in trying out Android Wear 2.0 on a budget should hit up the team’s Kickstarter page before they’re sold out.

    Monitor Changes and Auto-Enable Logging in AWS CloudTrail

    The content below is taken from the original (Monitor Changes and Auto-Enable Logging in AWS CloudTrail), to continue reading please visit the site. Remember to respect the Author & Copyright.

    AWS CloudTrail is a service that enables governance, compliance, operational auditing, and risk auditing of your AWS account. Hence, it’s crucial to monitor any changes to CloudTrail and make sure that logging is always enabled.

    With CloudTrail, you can log, continuously monitor, and retain events related to API calls across your AWS infrastructure. CloudTrail provides a history of API calls for your account, including API calls made through the console, AWS SDKs, command line tools, and other AWS services. This history simplifies security analysis, resource change tracking, and troubleshooting.

    In this post, I describe a solution to notify on changes to CloudTrail and re-enable logging whenever logging is disabled.

    Change monitoring and notification

    For this walkthrough, you use an Amazon CloudWatch Events rule to monitor changes to a CloudTrail trail. An AWS Lambda function set as a target for this rule contains the logic to detect changes to the trail and publish a message to an Amazon SNS topic, which notifies its subscribers. The diagram below depicts the workflow.

     

     

    1. An IAM user makes changes to a CloudTrail trail.
    2. That change event gets detected by a CloudWatch Events rule.
    3. The rule triggers a Lambda function.
    4. The function publishes the change event to an SNS topic.
    5. The SNS topic sends the email to its subscribers.
    6. If the change event was to disable logging, the function re-enables logging on that trail.

    The CloudWatch Events rule detects the following CloudTrail operational events:

    • “StopLogging”
    • “StartLogging”
    • “UpdateTrail”
    • “DeleteTrail”
    • “CreateTrail”
    • “RemoveTags”
    • “AddTags”
    • “PutEventSelectors”

    After a “StopLogging” event is detected, the Lambda function re-enables logging for that trail. This generates a “StartLogging” event that again sends an SNS notification.
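
    The packaged function code (Cloudtraillambdamonitor.zip) isn’t reproduced in this post, but a minimal sketch of such a handler might look like the following, assuming boto3, the SNSARN environment variable configured later in this walkthrough, and the IAM permissions shown in the policy below; the names used here are illustrative.

    # Minimal sketch of a handler that notifies on CloudTrail changes and
    # re-enables logging after a StopLogging call. Not the packaged code.
    import json
    import os

    import boto3

    sns = boto3.client('sns')
    cloudtrail = boto3.client('cloudtrail')

    def lambda_handler(event, context):
        # CloudWatch Events delivers the CloudTrail API call under the 'detail' key.
        detail = event.get('detail', {})
        event_name = detail.get('eventName', 'UnknownEvent')

        # Publish the change event to the SNS topic so subscribers get an email.
        sns.publish(
            TopicArn=os.environ['SNSARN'],
            Subject='CloudTrail change detected: ' + event_name,
            Message=json.dumps(detail, indent=2),
        )

        # If logging was stopped, turn it back on for the affected trail.
        if event_name == 'StopLogging':
            trail_name = detail.get('requestParameters', {}).get('name')
            if trail_name:
                cloudtrail.start_logging(Name=trail_name)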

     

    Walkthrough

    Now, I walk you through creating an SNS topic and subscription, Lambda function, and CloudWatch Events rule. To deploy this solution, download the CloudTrailMonitor.json AWS CloudFormation template. The README document provides instructions to deploy the stack.

    Create the SNS topic and subscription

    In the SNS console, choose Create topic and enter appropriate values for Topic name (such as CloudTrailAlert) and Display name (CT-Alert). Choose Create topic. Select the topic and view the details.

    Next, choose Create subscription.

    For Protocol, choose Email-JSON. Enter the email address where notifications should be sent and choose Create subscription.

    An email is sent to confirm the SNS topic subscription. In the email, open the SubscribeURL link to complete the subscription. Note the SNS topic ARN, as it is used later by the Lambda function.
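
    If you prefer to script this step rather than use the console, a boto3 sketch along these lines would create the topic and the Email-JSON subscription; the topic name matches the example above, and the email address is a placeholder.

    import boto3

    sns = boto3.client('sns')

    # Create the topic and note its ARN for the Lambda environment variable later.
    topic_arn = sns.create_topic(Name='CloudTrailAlert')['TopicArn']

    # Subscribe an email address with the Email-JSON protocol; a confirmation
    # email containing a SubscribeURL link is sent to this address.
    sns.subscribe(
        TopicArn=topic_arn,
        Protocol='email-json',
        Endpoint='you@example.com',  # placeholder address
    )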

    For more information, see Create a Topic in the Amazon SNS Developer Guide.

     

    Create the Lambda function

    In the Lambda console, choose Functions, Create a Lambda function. Choose Blank Function and on the Configure trigger page, choose Next.

    On the next page, enter the following values:

    • Name: An appropriate name for the Lambda function
    • Runtime: Python 2.7
    • Code entry type: Upload a ZIP file
    • Function package: Upload the Cloudtraillambdamonitor.zip file
    • Environment variables:
      • Key: SNSARN
      • Value: The SNS topic ARN noted earlier
    • Role: Create a custom role (takes you to another page). Call the role CloudTrailLambda.

    For the policy document, enter the following policy:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "LambdaCloudtraiMonitor",
                "Effect": "Allow",
                "Action": [
                    "cloudtrail:DescribeTrails",
                    "cloudtrail:GetTrailStatus",
                    "cloudtrail:StartLogging"
                ],
                "Resource": [
                    "arn:aws:cloudtrail:*:<AWS-ACCOUNT-ID>:trail/*"
                ]
            },
            {
                "Sid": "CloudWacthLogspermissions",
                "Effect": "Allow",
                "Action": [
                    "logs:CreateLogGroup",
                    "logs:CreateLogStream",
                    "logs:PutLogEvents"
                ],
                "Resource": [
                    "arn:aws:logs:*:*:*"
                ]
            },
            {
                "Sid": "SNSPublishpermissions",
                "Effect": "Allow",
                "Action": [
                    "sns:Publish"
                ],
                "Resource": [
                    "arn:aws:sns:*:*:*"
                ]
            }
        ]
    }
    

    On the Configure function page, choose Next. Review the configuration settings before choosing Create function.

    For more information, see Step 2.1: Create a Hello World Lambda Function in the AWS Lambda Developer Guide.

     

    Create the CloudWatch Events rule

    In the CloudWatch Events console, choose Create rule. Enter the following values:

    • Service Name:  CloudTrail
    • Event Type:  AWS API call via CloudTrail
    • Specific Operations:  StopLogging, StartLogging, UpdateTrail, DeleteTrail, CreateTrail, RemoveTags, AddTags, PutEventSelectors

    For Targets, select the name of the Lambda function created earlier and choose Configure details. On the next page, enter an appropriate name and description for this rule. For State, select Enabled. Choose Create rule.
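
    For reference, a boto3 sketch of an equivalent rule is shown below; the rule name, target ID and Lambda ARN are placeholders, and a scripted setup would also need lambda:AddPermission so the rule is allowed to invoke the function (the console grants this for you).

    import json

    import boto3

    events = boto3.client('events')

    # Match the CloudTrail management API calls listed above.
    event_pattern = {
        'source': ['aws.cloudtrail'],
        'detail-type': ['AWS API Call via CloudTrail'],
        'detail': {
            'eventSource': ['cloudtrail.amazonaws.com'],
            'eventName': ['StopLogging', 'StartLogging', 'UpdateTrail', 'DeleteTrail',
                          'CreateTrail', 'RemoveTags', 'AddTags', 'PutEventSelectors'],
        },
    }

    events.put_rule(
        Name='CloudTrailChangeRule',  # placeholder rule name
        EventPattern=json.dumps(event_pattern),
        State='ENABLED',
    )

    events.put_targets(
        Rule='CloudTrailChangeRule',
        Targets=[{
            'Id': 'CloudTrailMonitorLambda',
            # Placeholder ARN of the Lambda function created earlier.
            'Arn': 'arn:aws:lambda:us-east-1:<AWS-ACCOUNT-ID>:function:CloudTrailMonitor',
        }],
    )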

    For more information, see Tutorial: Schedule Lambda Functions Using CloudWatch Events in the Amazon CloudWatch Events User Guide.

     

    Validate monitoring

    To validate that the solution is working properly, make a change to CloudTrail and see if you get a notification about the change. The following are some sample emails from when a change in CloudTrail was detected. In this case, logging was disabled and re-enabled automatically.

     

    Summary

    In this post, I explained how to create a solution with CloudWatch Events, Lambda, and SNS to notify you about changes to CloudTrail trails, and to re-enable logging automatically whenever logging is disabled. If you can’t guarantee that your compliance logging is fully managed and automatic, your organizational governance or auditing may be at risk.

    For more information, I recommend the following whitepapers:

    About the Author

    Sudhanshu Malhotra is a Solutions Architect at AWS Professional Services. Sudhanshu enjoys working with our customers and helping them deliver complex solutions in AWS in the area of DevOps, Infrastructure-as-Code and Config Management. In his spare time, Sudhanshu enjoys spending time with his family, hiking and tinkering with cars.

    Mappy days! Ordnance Survey offers up free map of UK greenery

    The content below is taken from the original (Mappy days! Ordnance Survey offers up free map of UK greenery), to continue reading please visit the site. Remember to respect the Author & Copyright.

    The Ordnance Survey has launched a free online map of Britain’s green spaces with an open dataset for developers to get their hands on.

    The mapping agency’s latest offering pulls together geospatial data to create a map of concrete-free areas across the country – everything from your local park to an allotment.

    The work builds on the Scottish greenspace map, which the Ordnance Survey says was the first of its kind in the world when it was released in 2011. The latest map covers all of England and Wales, too.

    Using data from the Ordnance Survey as well as NGOs and other government agencies, the map pinpoints the location and extent of recreational areas and leisure facilities, colour coding them based on their use (bright green for public parks, brown for allotments).

    For the bigger sites, it marks the location of access points to the green spaces with a helpful green dot.

    The map is the latest in a line of releases from the Survey that aim to make its detailed maps more accessible for a digital age.

    Last year, it launched a smartphone app, giving would-be walkers more textures and information than your common (or garden) map app might offer.

    The aim is to get people off the couch and outdoors, and science minister Jo Johnson said that the latest map would “make it easier for people across the country to access greenspaces and lead healthier lives”.

    The Survey is also releasing a freely available dataset, OS Open Greenspace, which will become part of the Ordnance Survey’s Open Data portfolio.

    The dataset will be updated every six months, and the product can be downloaded in 100km sq tiles to allow people to easily locate and use just the area of interest. The technical specifications for the dataset can be found here (PDF).

    CEO Nigel Clifford said he was “excited to see how people experiment and work with the data” and looks forward “to seeing new products and services to help encourage an active Great Britain”.

    There’s also a public sector version of the greenspace map, called the MasterMap, which contains the location of all publicly accessible and non-accessible green spaces.

    The Survey said that giving the public sector accurate and up-to-date geospatial data would improve planning, analysis and decision-making.

    “It is hoped the dataset will prove instrumental in helping the public sector create and manage health and wellbeing strategies, active travel plans and various environmental initiatives that include air quality, biodiversity, housing regeneration and flood resilience,” the agency said. ®