MDT (6.3.8450.0)

The content below is taken from the original ( MDT (6.3.8450.0)), to continue reading please visit the site. Remember to respect the Author & Copyright.

Microsoft Deployment Toolkit (MDT) is a free computer program from Microsoft that assists with the deployment of Microsoft Windows and Office

Audi’s latest models add Amazon Music to the dashboard

The content below is taken from the original ( Audi’s latest models add Amazon Music to the dashboard), to continue reading please visit the site. Remember to respect the Author & Copyright.

If you're an Apple CarPlay or Android Auto user, you've no shortage of music streaming services baked into your dashboard. But if you're relying on your vehicle's default control panel, the choices start to dwindle. While automakers like Ford have s…

megui (1.0.2774)

The content below is taken from the original ( megui (1.0.2774)), to continue reading please visit the site. Remember to respect the Author & Copyright.

Portable video converter front-end for many free command line tools

PowerShell for SharePoint Online Usage Scenarios

The content below is taken from the original ( PowerShell for SharePoint Online Usage Scenarios), to continue reading please visit the site. Remember to respect the Author & Copyright.

PowerShell is not only a powerful tool for administering and managing a SharePoint Online (SPO) tenant, but also for many of the common activities you perform as an Office 365 or SPO administrator. In this article, I will cover some of the most common PowerShell for SharePoint Online usage scenarios, as described in Figure 1.

 

 

Figure 1: Common PowerShell for SPO Usage Scenarios.

Service Configuration and Administration Scenarios

These scenarios cover any action that applies specific SPO settings available through SPO PowerShell cmdlets and/or SPO APIs. Some typical operations that fall under these scenarios are the following:

  • While it’s true that OneDrive for Business (ODFB) and SPO support hashtag and percent symbols in file and folder names, you need to enable this support explicitly in your tenant by using PowerShell; there is no way to enable it in the SPO Administration UI. To enable the support for these symbols in ODFB and SPO, use the Set-SPOTenant cmdlet as follows:
# Allow # and % characters in file and folder names across the tenant
Set-SPOTenant -SpecialCharactersStateInFileFolderNames Allowed
# Verify the new setting
$O365SPOTenant=Get-SPOTenant
$O365SPOTenant.SpecialCharactersStateInFileFolderNames

  • Configuring the sharing capability at the tenant or site collection level is very important when we want to share an Office 365 Group site with external users without adding them as guests in the Group. To enable external user sharing on an Office 365 Group site, we only need the Set-SPOSite cmdlet, as detailed below:
# Enable sharing with authenticated external users on the Group site
$sO365GroupSite="https://<Your_Group_Site_Url>"
Set-SPOSite -Identity $sO365GroupSite -SharingCapability ExternalUserSharingOnly

Auditing Operations and Reporting Scenarios

On the one hand, the Auditing Operations scenario is intended to provide information about what is happening in any logical container in an SPO tenant (site collections, sites, lists, document libraries, etc.) with regard to common operations, such as creating or updating content, making changes to the SPO security model, and so on. On the other hand, the Reporting scenario covers generating reports about the activities taking place in SPO. Some good examples of these scenarios:

  • Get information about the SPO tenant logical and information architecture in terms of deployed site collections, sites, lists, and document libraries.
  • Get detailed information about security settings at different levels (site collections, sites, lists and document libraries, list elements, and documents), such as:
    • SharePoint security groups in use
    • Users/Group members of each SharePoint security group

 

 

For instance, if you are asked to provide a report with all the members of each SharePoint security group configured on an SPO site, you only need to execute the following PowerShell script, which uses the Get-SPOSiteGroup and Get-SPOUser cmdlets:

# $sSiteUrl holds the URL of the SPO site to report on
$spoSharePointGroups=Get-SPOSiteGroup -Site $sSiteUrl
foreach($spoSharePointGroup in $spoSharePointGroups){
    Write-Host "Users in " $spoSharePointGroup.Title ":"
    $spoUsers=Get-SPOUser -Site $sSiteUrl -Group $spoSharePointGroup.Title
    Write-Host " -> " $spoUsers.LoginName
    Write-Host "--------------------------------" -ForegroundColor Green
}

  • Get detailed information about an SPO tenant:
    • Storage used in each site collection in the tenant
    • Changes happening in the tenant
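
For instance, to report on the storage used by each site collection, a single pipeline is enough. A minimal sketch, assuming you have already connected to the tenant with Connect-SPOService:

# List every site collection with its current storage usage and quota (values in MB)
Get-SPOSite -Limit All | Select-Object Url, StorageUsageCurrent, StorageQuota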

For instance, to query the Office 365 audit log and get information about file activities happening in all the sites in the tenant simply execute the following PowerShell script:

# Connect to Exchange Online remote PowerShell, which hosts the audit log cmdlets
$PSSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Cred -Authentication Basic -AllowRedirection
Import-PSSession $PSSession
# Search the unified audit log for SharePoint file access operations
Search-UnifiedAuditLog -StartDate 12/1/2017 -EndDate 12/7/2017 -RecordType SharePointFileOperation -Operations FileAccessed -SessionId "Docs_SharepointViews" -SessionCommand ReturnNextPreviewPage

SPO Solutions Deployment Scenario

PowerShell is a common vehicle for deploying solutions on top of SPO, including any kind of customization to new or existing SPO sites. Under this scenario, we can find a wide range of possibilities:

  • Apply a common look and feel (for instance a theme) to all the sites defined under a specific site collection.
  • Provision the full information architecture required for an SPO solution being developed: Site Collections, Sites, Site Columns, Content Types, etc.
  • Deploy Apps or WebParts to new or existing SPO Sites.
  • Configure the security model for the solution (SharePoint security groups, permission levels, the permission inheritance mechanism, etc.).

As an example, you can create a new SPO list in an SPO site using the following PowerShell script that makes use of the client-side object model (CSOM) SPO API:

#Adding the Client OM Assemblies ($sCSOMPath points to the folder holding the CSOM DLLs)
$sCSOMRuntimePath=$sCSOMPath + "\Microsoft.SharePoint.Client.Runtime.dll"
$sCSOMAssemblyPath=$sCSOMPath + "\Microsoft.SharePoint.Client.dll"
Add-Type -Path $sCSOMAssemblyPath
Add-Type -Path $sCSOMRuntimePath
#SPO Client Object Model Context ($sPassword must be a SecureString)
$spoCtx = New-Object Microsoft.SharePoint.Client.ClientContext($sSiteUrl)
$spoCredentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($sUserName, $sPassword)
$spoCtx.Credentials = $spoCredentials
#Creating the List
$spoWeb=$spoCtx.Web
$spoListCreationInformation=New-Object Microsoft.SharePoint.Client.ListCreationInformation
$spoListCreationInformation.Title=$sListName
$spoListCreationInformation.TemplateType=[int][Microsoft.SharePoint.Client.ListTemplateType]::GenericList
$spoList=$spoWeb.Lists.Add($spoListCreationInformation)
$spoList.Description=$sListDescription
$spoList.Update()
$spoCtx.ExecuteQuery()
$spoCtx.Dispose()

 

Information Loading and Migration Scenarios

Finally, the last scenarios cover situations where it’s necessary either to upload data to SPO sites or to move/migrate information to SPO sites. Note that this information could come from another SPO site or even another SPO tenant, from a SharePoint on-premises farm, or from a corporate file server. Some examples of situations that fall under these scenarios:

  • Move documents from local file systems, other cloud storage services (Dropbox, Box, Google Drive), or SharePoint on-premises to SPO and OneDrive for Business.
  • Load information into SPO from different information sources (local files, SQL databases, non-SQL databases, etc.).

For instance, the following PowerShell script uses the SPO CSOM API to upload information from a CSV file to an SPO list:

# Connect to the site ($sPassword must be a SecureString)
$spoCtx = New-Object Microsoft.SharePoint.Client.ClientContext($sSiteUrl)
$spoCredentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($sUserName, $sPassword)
$spoCtx.Credentials = $spoCredentials
#Adding Data to an existing list
# $tblItems holds the rows read from the CSV file, e.g. $tblItems=Import-Csv -Path $sCSVFile
$spoList = $spoCtx.Web.Lists.GetByTitle($sListName)
$spoCtx.Load($spoList)
foreach ($sItem in $tblItems) {
    Write-Host "Adding " $sItem.SPOListItem " to $sListName"
    $spoListItemCreationInformation = New-Object Microsoft.SharePoint.Client.ListItemCreationInformation
    $spoListItem=$spoList.AddItem($spoListItemCreationInformation)
    $spoListItem["Title"]=$sItem.SPOListItem.ToString()
    $spoListItem.Update()
    $spoCtx.ExecuteQuery()
}
$spoCtx.Dispose()

 


Conclusions

PowerShell for SPO is a tool not only for platform administration and configuration tasks but also for many other common activities that an SPO administrator (or an Office 365 one) might need: auditing operations, reporting, SPO solutions deployment, data loading, and migration.

The post PowerShell for SharePoint Online Usage Scenarios appeared first on Petri.

Neural Network Learns SDR Ham Radio

The content below is taken from the original ( Neural Network Learns SDR Ham Radio), to continue reading please visit the site. Remember to respect the Author & Copyright.

Identifying ham radio signals used to be easy. Beeps were Morse code, voice was AM unless it sounded like Donald Duck, in which case it was sideband. But there are dozens of modes in common use now, including TV, digital data, digital voice, and FM, with more coming online every day. [Randaller] used CUDA to build a neural network that interfaces with an RTL-SDR dongle and classifies the signals it hears. Since it is a neural network, it isn’t so much programmed to do it as it is trained. The proof of concept was trained to distinguish FM, SECAM, and TETRA. However, you can train it to recognize other modulation schemes if you want to invest the time into it.

It isn’t that big of a task to identify signals using your built-in neural network. However, this is a great example of a practical neural net and it does open the door to other possibilities. For example, automated monitoring of multiple channels would need something like this.

One interesting tidbit is that the neural network doesn’t really know what it is learning, so input samples could be IQ samples, audio, or even waterfall graphics. You just have to use the same input to train that you want to use during operation. In fact, the code apparently started out as an image classification network from a course by Stanford.

If this gives you the urge to go out and buy an RTL-SDR dongle, you might want to look at some reviews. What else could you do with an intelligent radio? We’ve already seen a different kind of neural network decode Enigma traffic.

Filed under: Wireless Hacks

Automating IAM Roles For Cross-Account Access Series Overview

The content below is taken from the original ( Automating IAM Roles For Cross-Account Access Series Overview), to continue reading please visit the site. Remember to respect the Author & Copyright.

The AWS Partner Network Blog has recently published a series describing a method to automate the creation of an IAM role for cross-account access, and how to collect the information needed for a partner to assume the role after creation. This post gives readers an overview of the series, summarizing each of the individual posts with links back to the original content for further reading.

As a reminder, cross-account IAM roles allow customers to grant access to resources within their account to a partner or other third parties while enabling the customers to maintain their security posture. Cross-account roles allow the customer to delegate access without the need to distribute key material, and without the burden on the third party to safely handle key material after receipt.

The blog series kicked off with a post that explained how to create a custom launch stack URL for AWS CloudFormation. The URL will take users directly to the CloudFormation Create Stack wizard, with values for the Amazon S3 template location, stack name, and default parameters already populated. The launch stack URL eliminates the need to exchange template files with the customer, and ensures that the customer is using the proper template with the correct values.

The second post describes how to use an AWS Lambda function to populate an AWS CloudFormation template with uniquely generated values. The series example uses an External ID, an ID that is unique for each end user. This value needs to be set within the CloudFormation template. When triggered, the Lambda function pulls down the default template, inserts a generated unique External ID into the template, and uploads the customized template to an S3 bucket. Once the template upload is complete, the end user is presented with a custom launch stack URL containing the unique template Amazon S3 location. Finally, the post demonstrates how to use the Launch Stack icon to make the URL more visible to users.

The third post details how to reliably return the Amazon Resource Name (ARN) of the cross-account role created by AWS CloudFormation to a third party. As a reminder, the third party must use the ARN, as well as the ExternalID, when assuming the role in the end user’s account. The post demonstrates a CloudFormation custom resource designed to send the ARN back to the third-party account, which consumes the ARN and stores it for later use.
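
To make this concrete, once the partner holds both the role ARN and the External ID, assuming the role is a short script. A minimal sketch, assuming the AWS Tools for PowerShell module; the ARN and External ID values shown are hypothetical placeholders:

# Assume the cross-account role using the stored ARN and External ID (placeholder values)
$roleArn = "arn:aws:iam::111122223333:role/PartnerAccessRole"
$externalId = "unique-external-id-for-this-customer"
$session = Use-STSRole -RoleArn $roleArn -RoleSessionName "PartnerSession" -ExternalId $externalId
# The temporary credentials returned can then be used to call AWS APIs in the customer account
Get-S3Bucket -Credential $session.Credentials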

The final post of the series brings the details of the previous three blog posts together into one cohesive solution. It shows how to implement the automation of cross-account role creation for customer onboarding using the techniques described in each post in a completed workflow. The workflow creates a smoother onboarding experience for the customer while creating a secure way for the third party account to create resources within the customer account.

We hope that the blog series can help you and your company improve your customer onboarding experience. You can avoid sharing sensitive keys and the error-prone approach of requiring your customers to cut and paste information between their account and your onboarding portal.


About the Author

Erin McGill is a Solutions Architect in the AWS Partner Program with a focus on DevOps and automation tooling.

10 RISC OS gift ideas for Christmas

The content below is taken from the original ( 10 RISC OS gift ideas for Christmas), to continue reading please visit the site. Remember to respect the Author & Copyright.

Here are some thoughts for some RISC OS gifts to treat yourself or your RISC OS loved ones from 2017.

1. The latest 5.23 ROM was released. Get the latest release on SD card, either on its own or combined with lots of great software.
2. The BBC BASIC manual, now updated after 25 years.
3. The latest DDE release, complete with a wealth of electronic reference materials.
4. The latest edition of !Artworks, now at 2.X3.
5. Contribute to a bounty to help fund future RISC OS releases.
6. Relax with some new Arcade games from Amcog games.
7. Organizer 2.28 gives you the ultimate Calendar and Organiser on your RISC OS machine.
8. Get your Fonts back into order with Font Directory Pro.
9. Keep using your old software on new hardware with Aemulor.
10. A Raspberry Pi is stocking-sized with a price to match, and opens up the RISC OS and Linux software worlds.

What would you like to see under the tree?


RIP, AOL Instant Messenger

The content below is taken from the original ( RIP, AOL Instant Messenger), to continue reading please visit the site. Remember to respect the Author & Copyright.

We knew this day would come. One of the major parts of our formative years on the worldwide web — we called it that back in the day — will cease to be. AOL Instant Messenger (AIM) came to a close a few hours ago. While we've already eulogized it, i…

ADSL Robustness Verified By Running Over Wet String

The content below is taken from the original ( ADSL Robustness Verified By Running Over Wet String), to continue reading please visit the site. Remember to respect the Author & Copyright.

A core part of the hacker mentality is the desire to test limits: trying out ideas to see if something interesting, informative, and/or entertaining comes out of it. Some employees of Andrews & Arnold (a UK network provider) applied this mentality towards connecting their ADSL test equipment to some unlikely materials. The verdict of experiment: yes, ADSL works over wet string.

ADSL itself is something of an ingenious hack, carrying data over decades-old telephone wires designed only for voice. ADSL accomplished this in part through robust error correction measures keeping the bytes flowing through lines that were not originally designed for ADSL frequencies. The flow of bytes may slow over bad lines, but they will keep moving.

How bad? In this case, a pair of strings dampened with salty water. But there are limits: the same type of string dampened with just plain water was not enough to carry ADSL.

The pictures of the test setup also spoke volumes. They ran the wet string across a space that looked much like every hacker workspace, salt water dripping on the industrial carpet. Experimenting and learning right where you are, using what you have on hand, are hallmarks of hacker resourcefulness. Fancy laboratory not required.

Thanks to [chris] and [Spencer] for the tips.

Filed under: Network Hacks

Alibaba Cloud Becomes the First Cloud Computing Company to Obtain C5 Attestation with Additional Requirements

The content below is taken from the original ( Alibaba Cloud Becomes the First Cloud Computing Company to Obtain C5 Attestation with Additional Requirements), to continue reading please visit the site. Remember to respect the Author & Copyright.

Alibaba Cloud, the cloud computing arm of the Alibaba Group, announced today that it had completed its assessment for the Cloud Computing Compliance Controls Catalogue (C5). Read more at VMblog.com.

Cloud storage now more affordable: Announcing general availability of Azure Archive Storage

The content below is taken from the original ( Cloud storage now more affordable: Announcing general availability of Azure Archive Storage), to continue reading please visit the site. Remember to respect the Author & Copyright.

Today we’re excited to announce the general availability of Archive Blob Storage starting at an industry leading price of $0.002 per gigabyte per month! Last year, we launched Cool Blob Storage to help customers reduce storage costs by tiering their infrequently accessed data to the Cool tier. Organizations can now reduce their storage costs even further by storing their rarely accessed data in the Archive tier. Furthermore, we’re also excited to announce the general availability of Blob-Level Tiering, which enables customers to optimize storage costs by easily managing the lifecycle of their data across these tiers at the object level.

From startups to large organizations, our customers in every industry have experienced exponential growth of their data. A significant amount of this data is rarely accessed but must be stored for a long period of time to meet either business continuity or compliance requirements; think employee data, medical records, customer information, financial records, backups, etc. Additionally, recent and coming advances in artificial intelligence and data analytics are unlocking value from data that might have previously been discarded. Customers in many industries want to keep more of these data sets for a longer period but need a scalable and cost-effective solution to do so.

“We have been working with the Azure team to preview Archive Blob Storage for our cloud archiving service for several months now.  I love how easy it is to change the storage tier on an existing object via a single API. This allows us to build Information Lifecycle Management into our application logic directly and use Archive Blob Storage to significantly decrease our total Azure Storage costs.”

-Tom Inglis, Director of Enabling Solutions at BP

Azure Archive Blob Storage

Azure Archive Blob storage is designed to provide organizations with a low cost means of delivering durable, highly available, secure cloud storage for rarely accessed data with flexible latency requirements (on the order of hours). See Azure Blob Storage: Hot, cool, and archive tiers to learn more.

Archive Storage characteristics include:

  • Cost-effectiveness: Archive access tier is our lowest priced storage offering for long-term storage which is rarely accessed. Preview pricing will continue through January 2018. For new pricing effective February 1, 2018, see Archive Storage General Availability Pricing.
  • Seamless Integration: Customers use the same familiar operations on objects in the Archive tier as on objects in the Hot and Cool access tiers. This will enable customers to easily integrate the new access tier into their applications.
  • Durability: All access tiers including Archive are designed to offer the same high durability that customers have come to expect from Azure Storage with the same data replication options available today.
  • Security: All data in the Archive access tier is automatically encrypted at rest using 256-bit AES encryption, one of the strongest block ciphers available.
  • Global Reach: Archive Storage is available today in 14 regions – North Central US, South Central US, East US, West US, East US 2, Central US, West US 2, West Central US, North Europe, West Europe, Korea Central, Korea South, Central India, and South India.

Blob-Level Tiering: easily optimize storage costs without moving data

To simplify data lifecycle management, we now allow customers to tier their data at the object level. Customers can easily change the access tier of a single object among the Hot, Cool, or Archive tiers as usage patterns change, without having to move data between accounts. Blobs in all three access tiers can co-exist within the same account.

Flexible management

Archive Storage and Blob-Level Tiering are available on both new and existing Blob Storage and General Purpose v2 (GPv2) accounts. GPv2 accounts are a new account type that support all our latest features, while offering support for Block Blobs, Page Blobs, Files, Queues, and Tables. Customers with General Purpose v1 (GPv1) accounts can easily convert their accounts to a General Purpose v2 account through a simple 1-click step (Blob Storage account conversion support coming soon). GPv2 accounts have a different pricing model than GPv1 accounts, and customers should review it prior to using GPv2 as it may change their bill. See Azure Storage Options to learn more about GPv2, including how and when to use it. 

A user may access Archive and Blob-Level Tiering via the Azure portal (Figure 1), PowerShell and CLI tools, REST APIs, and the .NET (Figure 2), Java, Python, and Node.js client libraries.

Figure 1: Set blob access tier in portal

 

// Cast a listed blob item to a block blob and move it to the Archive tier
CloudBlockBlob blob = (CloudBlockBlob)items;
blob.SetStandardBlobTier(StandardBlobTier.Archive);

Figure 2: Set blob access tier using .NET client library
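
The same tier change can be scripted. A minimal PowerShell sketch, assuming the Azure Storage PowerShell module of the time; the account, container, and blob names are placeholders:

# Move a single block blob to the Archive tier from PowerShell
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $sAccountKey
$blob = Get-AzureStorageBlob -Container "backups" -Blob "archive2017.bak" -Context $ctx
# SetStandardBlobTier accepts Hot, Cool, or Archive
$blob.ICloudBlob.SetStandardBlobTier("Archive")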

Partner Integration

We integrate with a broad ecosystem of partners to jointly deliver solutions to our customers. The following partners support Archive Storage:

Commvault’s Windows/Azure Centric software solution enables a single solution for storage-agnostic, heterogeneous enterprise data management. Commvault’s native support for Azure, including being one of the first Windows/ISV to be "Azure Certified" has been a key benefit for customers considering a Digital Transformation to Azure. Commvault remains committed to continuing our integration and compatibility efforts with Microsoft, befitting a close relationship between the companies that has existed for over 17 years. This includes quick, cost effective and efficient movement of data to Azure while enabling indexing such that our customers can proactively use the data we send to Azure, including "Azure Archive". With this new Archive Storage offering, Microsoft again makes significant enhancements to their Azure offering and we expect that this service will be an important driver of new and expanding opportunities for both Commvault and Microsoft.

NetApp® AltaVault™ cloud-integrated storage enables customers to tap into cloud economics and securely backup data to Microsoft Azure cloud storage at up to 90% lower cost compared with on-premises solutions. AltaVault’s modern storage architecture optimizes data using class-leading deduplication, compression, and encryption. Optimized data is written to Azure blob storage, reducing WAN bandwidth requirements and ensuring maximum data security. By adding Day 1 support for Azure Archive storage, AltaVault provides organizations access to the most cost-effective Azure blob storage tier, significantly driving down costs for rarely accessed long term backup and archive datasets. Try AltaVault’s free 90-day trial and see how easy it is to leverage Microsoft Azure Archive cloud storage today.

HubStor is a cloud archiving platform that converges long-term retention and data protection for on-premises file servers, Office 365, email, and other sources of unstructured data content. Delivered as Software-as-a-Service (SaaS) exclusively on the Azure cloud platform, IT teams are adopting HubStor to understand, secure, and manage large volumes of data in Azure with policies for classification, indexing, WORM retention, deletion, and tiering. As detailed in this post, customers can now apply HubStor’s built-in file analytics and storage tiering policies with the new Azure Archive Blob Storage tier to place the right data on the optimal tier at the best time in the information lifecycle. Enterprise Strategy Group recently completed a lab validation report on HubStor which you can download here.

The purpose of CloudBerry Backup for Microsoft Azure is automating data upload to Microsoft Azure cloud storage. It is able to compress and encrypt the data with a user-defined password prior to the upload. It then securely transfers it to the cloud either on schedule or in real time. CloudBerry Backup also comes with file-system and image-based backup, SQL Server and MS Exchange support, as well as flexible retention policies and incremental backup. CloudBerry Backup now supports Microsoft Azure Archive Blob Storage for storing backup and archival data.

Archive2Azure, the intelligent data management and compliance archiving solution, provides customers a native Azure archiving application. Archive2Azure enables companies to provide automated retention, indexing on demand, encryption, search, review, and production for long term archiving of their compliance, active, low-touch, and inactive data from within their own Azure tenancy. This pairing of the Azure Cloud with Archive2Azure’s archiving and data management capabilities provides companies with the cloud-based security and information management they have long sought. With the general availability of Azure’s much anticipated Archive Storage offering, the needed security and lower cost to archive and manage data for extended periods is now possible. With the availability of the new Archive Storage offering, Archive2Azure can now offer Azure’s full range of storage tiers providing users a wide choice of storage performance and cost.

[Archive support coming soon] Cohesity delivers the world’s first hyper-converged storage system for enterprise data. Cohesity consolidates fragmented, inefficient islands of secondary storage into an infinitely expandable and limitless storage platform that can run both on-premises and in the public cloud. Designed with the latest web-scale distributed systems technology, Cohesity radically simplifies existing backup, file shares, object, and dev/test storage silos by creating a unified, instantly-accessible storage pool. The Cohesity platform will support Azure Archive Storage for the following customer use cases: (i) long-term data retention for infrequently accessed data that require cost effective lowest priced blob storage, (ii) blob-level tiering functionality among Hot, Cool and Archive tiers, and (iii) ease of recovery of data from cloud back to on-premise independent of which Azure blob tier the data is in. Note that Azure Blob storage can be easily registered and assigned via Cohesity’s policy-based administration portal to any data protection workload running on the Cohesity platform.

Igneous Systems delivers the industry’s first secondary storage system built to handle massive file systems. Offered as-a-Service and built using a cloud-native architecture, Igneous Hybrid Storage Cloud provides a modern, scalable approach to management of unstructured file data across datacenters and public cloud, without the need to manage infrastructure. Igneous supports backup and long-term archiving of unstructured file data to Azure Archive Blob Storage, enabling organizations to replace legacy backup software and targets with a hybrid cloud approach.

[Archive support coming soon] Rubrik orchestrates all critical data management services – data protection, search, development, and analytics – on one platform across all your Microsoft applications. By adding integration with Microsoft Azure Archive Storage Tier, Rubrik will complete support for all storage classes of Azure. With Rubrik, enterprises can now automate SLA compliance to any class in Azure with one policy engine and manage all archival locations in a single consumer-grade interface to meet regulatory and legal requirements. Leverage a rich suite of API services to create custom lifecycle management workflows across on-prem to Azure. Rubrik Cloud Data Management was architected from the beginning to deliver cloud archival services with policy-driven intelligence. Rubrik has achieved Gold Cloud Platform competency and offers end-to-end coverage of Microsoft technologies and services (physical or virtualized Windows, SQL, Hyper-V, Azure Stack, and Azure).

New Azure management and cost savings capabilities

The content below is taken from the original ( New Azure management and cost savings capabilities), to continue reading please visit the site. Remember to respect the Author & Copyright.

Enterprise customers choose Azure because of the unique value it provides as a productive, hybrid, intelligent and trusted cloud. Today I’m excited to announce four new management and cost savings capabilities. Azure Policy, now in public preview, provides control and governance at scale for your Azure resources. Azure Cost Management is rolling out support for Azure Virtual Machine Reserved Instances management later this week to help you maximize savings over time. To continue our commitment to making Azure cost-effective, we are reducing prices by up to 4% on our Dv3 Series in several regions in the coming days, and making our lowest priced Storage tier, Azure Archive Storage, generally available today.

Simple ways to ensure a secure and well-managed cloud infrastructure

Azure is committed to providing a secure cloud foundation, while making available a comprehensive set of services to ensure that your cloud resources are secure and well-managed. Cloud security and management is a joint responsibility between Microsoft and the customer. We recommend that customers follow secure and well-managed cloud best practices for every production virtual machine. To help you achieve this goal, Azure has built-in services that can be configured quickly, are always up to date and are tightly integrated into the Azure experience. Take advantage of Azure Security Center for security management and threat protection, back up data to protect against ransomware and human errors with Azure Backup, and keep your applications running with Azure Monitor and Log Analytics. Check out the new poster that describes the Azure security and operations management services.

Enterprise customers have asked for better ways to help them manage and secure cloud resources at scale to accelerate cloud adoption. Azure Policy allows you to turn on built-in policies or build your own custom policies to enable company-wide governance. For example, you can set your security policy for your production subscription once and apply that policy to multiple subscriptions. I am happy to announce that Azure Policy is now in public preview.
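
As a sketch of how an assignment might be scripted with the AzureRM cmdlets of the time (the policy display name and subscription ID below are placeholders):

# Find a built-in policy definition and assign it at subscription scope
$policy = Get-AzureRmPolicyDefinition | Where-Object { $_.Properties.displayName -eq "Audit VMs that do not use managed disks" }
New-AzureRmPolicyAssignment -Name "audit-unmanaged-disks" -PolicyDefinition $policy -Scope "/subscriptions/<subscription-id>"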

Most value for every cloud dollar spent

With Azure Cost Management, Azure is the only platform that offers an end-to-end cloud cost management and optimization solution to help customers make the most of cloud investment across multiple clouds. Cost Management is free to all customers to manage their Azure spend. We are continuing to invest in bringing new capabilities to Cost Management. I am excited to announce that Cost Management supports Azure Reserved Virtual Machine Instances management starting December 15th.

In Azure, we have a long standing promise of making our prices comparable with AWS on commodity services such as compute, storage, and bandwidth. In keeping with this commitment, we are happy to announce price reductions of up to 4% on our latest general-purpose virtual machines, Dv3 Series in US West 2, US East and Europe North. These prices will take effect on January 5th.

We often hear customers are looking to the cloud for cost-effective ways to manage and store their infrequently accessed data for use cases like backup and archiving. Today, we’re announcing general availability of Azure Archive Storage, our lowest priced Storage tier yet. You can learn more details here.

Azure is the most cost-effective cloud for Windows Server workloads. If you are a Windows Server customer with Software Assurance, you can combine Azure Reserved Instances (RIs) with Azure Hybrid Benefits and save up to 82% compared to pay-as-you-go prices, and up to 67% compared to AWS RIs for Windows VMs. In addition, with Azure Hybrid Benefit for SQL Server, customers with Software Assurance will be able to save even more.

There are many other ways to save money with Azure. To learn more, check out the new Azure Cost Savings infographic below.


Azure provides the broadest set of security and management capabilities built into a public cloud platform. With these capabilities, customers can more easily secure and manage hybrid infrastructure resources while achieving significant cost savings. Activate Security Center, Backup, Log Analytics and Cost Management today to ensure a secure and well-managed cloud infrastructure with optimized efficiency.

Lack of Migration Tools Can Cause Problems Moving to Teams

The content below is taken from the original ( Lack of Migration Tools Can Cause Problems Moving to Teams), to continue reading please visit the site. Remember to respect the Author & Copyright.

No-one Likes Migrations but Everyone Loves Teams

Let’s say that you decide to embrace Microsoft Teams and move some email traffic to the new platform. Or that you want to move away from a competing chat platform like Slack or HipChat because Teams is part of your Office 365 plan and better integrated with other Office 365 applications, or because Teams is taking over from Skype for Business Online. The question might then arise whether you need to move any information from your current platform to Teams.

In some cases, the answer is no, and you can start with a clean slate. Users finish up whatever they are working on with the old platform before moving to Teams. In other cases, the old platform holds corporate knowledge or other essential information (like records needed for compliance) that you must preserve before you can decommission that platform.

Moving Email to Teams

Email includes personal mailboxes, shared mailboxes, and site mailboxes. Because email exists alongside Teams, there is often no need to move anything unless you have an important message or attachment that must be in Teams. In this case, because you cannot drag and drop items from an email client into Teams, the easiest solution is to email it to the channel that you want it to be in.

Unless blocked by tenant settings, each channel has a unique email address in the form [email protected]. To get the address, click the ellipsis menu for the channel. You can then copy the email address revealed by Teams (Figure 1) and use it to send whatever information you want to the channel. The technique works for any application capable of sending email via SMTP.
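
For instance, you could post a document to a channel from PowerShell. A minimal sketch; the channel address, sender, file path, and SMTP settings are placeholders for your own values:

# Email a document to a Teams channel via the channel's email address
# ($cred holds the sender's credentials, e.g. from Get-Credential)
Send-MailMessage -To "channel-alias@amer.teams.ms" -From "user@yourtenant.com" `
    -Subject "Project plan" -Body "Posting this document to the channel." `
    -Attachments "C:\Docs\ProjectPlan.docx" `
    -SmtpServer "smtp.office365.com" -Port 587 -UseSsl -Credential $cred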


Figure 1: Email address for a channel (image credit: Tony Redmond)

Teams stores copies of received messages and any attachments in the Email Messages folder for the channel in the SharePoint document library used by the team. Emailing individual items is tiresome if you must process more than a few items, but it is effective.

Moving Documents to Teams

The thing to remember about documents is that Teams uses SharePoint for its document management. Each team has a SharePoint team site with a document library and each channel in the team has a separate folder. Anything you can do to move documents around for SharePoint applies to Teams.

You can email documents to move them to Teams, but if the documents are already in a SharePoint library, it is better to create a tab that brings people to the library. If the team members have access to the target library, they can work with the files stored there through Teams.

First, get the URL for the target library by accessing it with a browser and copying the URL. Then, create a new SharePoint tab and insert the link you copied (Figure 2). Give the tab a name that tells users what the library holds.


Figure 2: Linking Teams to a SharePoint Library (image credit: Tony Redmond)

Linking a team to an existing SharePoint library might be a good way to move away from the now-deprecated site mailboxes. That is, unless you need offline access.

To move documents from other SharePoint libraries to those used by Teams, you can use SharePoint’s Move function. However, if you have hundreds of documents to move, it might be easier to synchronize both libraries with the OneDrive sync client and then copy whatever you need from the target library to the Teams library.

If you want to move documents from file servers or SharePoint on-premises servers, a range of third-party tools are available from ISVs such as ShareGate, Metalogix, and AvePoint. Alternatively, Microsoft announced their own SharePoint Migration Tool at Ignite 2017 (the tool is still in preview).

Moving Conversations to Teams

Apart from Teams, tenants can use Office 365 Groups and Yammer Groups for collaboration. Other than emailing selected items, there is no way to move conversations hosted by these platforms to Teams.

Moving from Other Chat Platforms to Teams

Those who want to move information from other chat platforms might be out of luck. Although it is possible to extract data from platforms like HipChat and Slack, the issue is how to import that data into Teams. Documents are self-contained and can go into SharePoint; messages are a different matter.

To date, Microsoft has not created an API to import messages into Teams, perhaps because of the problems involved in taking information from different platforms and bringing that data into Teams in a way that keeps the data useful.

Processing imported data can be complex. For instance, assume you export messages from another platform. Unless you want to dump the messages as individual items into Teams, you might want to check date formats, connect the messages belonging to a conversation together so that Teams displays them as a conversation, and match user names against Azure Active Directory and team membership. This issue is not unique to Teams, as some fix-up processing is usually needed whenever you move data between platforms.

The DIY Approach

Some light might be on the horizon in a GitHub project called “Channel Surf,” intended to help companies move from Slack to Teams. Although the author (Tam Huynh) is a Microsoft employee, this is a community project. As Tam explains in a post in the Microsoft Tech Community, the aim is to allow you to recreate an existing Slack channel structure in Teams.

The basic idea is that you create a Slack archive and use the data in the archive to recreate the channels in Teams. To populate the channels, the code generates HTML files for the Slack messages and copies them to Teams. The code also copies any attachments found in Slack to Teams.

Tam notes that he used public APIs in the project and that some functions needed to perform a true message import are not yet available. However, it is a work in progress that anyone can get involved with to improve.

No ISVs Support Teams Migration

Many ISVs offer migration products for Office 365 and it is surprising that none (that I can find) have any solution for Teams migration more than a year after Teams appeared in preview. The likely reason is the lack of a supported public API to allow ISVs to perform the necessary fix-up when moving data from different platforms into Teams.

Migration Might Not be a Road Block

Although no migration tools are available for Teams today, I do not think this is a road block for deployment. You can move high-value items like documents into SharePoint relatively easily and email individual messages if necessary. In the absence of tools to move conversations from Slack or other chat platforms, you can either wait for the market to mature and migration tools to appear or consider a cutover migration. It’s an imperfect situation for now.

Follow Tony on Twitter @12Knocksinna.

Want to know more about how to manage Office 365? Find what you need to know in “Office 365 for IT Pros”, the most comprehensive eBook covering all aspects of Office 365. Available in PDF and EPUB formats (suitable for iBooks) or for Amazon Kindle.

The post Lack of Migration Tools Can Cause Problems Moving to Teams appeared first on Petri.

A humanoid robot carried the Olympic torch in South Korea

The content below is taken from the original ( A humanoid robot carried the Olympic torch in South Korea), to continue reading please visit the site. Remember to respect the Author & Copyright.

One of the traditions of the Olympics is the torch relay, in which people carry the flame from Olympia, Greece to the location of the Games. In 2018, the Olympic Games will be held in Pyeongchang, South Korea, and the torch relay is currently underwa…

How XRP Stacks Up Against Other Digital Assets

The content below is taken from the original ( How XRP Stacks Up Against Other Digital Assets), to continue reading please visit the site. Remember to respect the Author & Copyright.

Everyone is talking about the digital asset space. Wild price fluctuations, new XRP capital funds and Bitcoin (BTC) forks have made it virtually impossible for consumers or the financial industry to ignore the popularity and proliferation of these assets.

In fact, the digital asset landscape has grown so much that there are more than 1,300 types of assets on the market with a collective market cap of $450 billion as of this posting.

We examined the top digital assets for payments while comparing their speed, cost and scalability. Here’s what we found.

 

Which digital asset is the best for payments?

XRP is the only digital asset with a clear institutional use case designed to solve a multi-trillion dollar problem – the global payment and liquidity challenges that banks, payment providers and corporates face. In order to effectively solve this problem, speed, cost and scalability are of extreme importance. When you line up the top digital assets for these attributes, it’s clear that XRP is the winner.

XRP is part of a bigger vision

XRP is a key enabler of the Internet of Value — Ripple’s vision for making money move at the speed of digital information. XRP’s speed, transparency, and scalability help financial institutions move money like information moves today — in real time.

It’s no wonder that real institutional customers are using and finding value in XRP and governments, regulators and central banks are increasingly recognizing the role it could play in the global system.

For more information about XRP, visit our XRP Charts or learn how to buy XRP.

The post How XRP Stacks Up Against Other Digital Assets appeared first on Ripple.

Google app experiments push the limits of phone photography

The content below is taken from the original ( Google app experiments push the limits of phone photography), to continue reading please visit the site. Remember to respect the Author & Copyright.

Google doesn't want to limit its photographic prowess to its own phones — it just released an initial batch of "appsperiments" that use the company's knack for computer vision and other technologies to test the boundaries of phone photography. Stor…

Roller Coaster Tycoon IRL

The content below is taken from the original ( Roller Coaster Tycoon IRL), to continue reading please visit the site. Remember to respect the Author & Copyright.

Additive manufacturing has come a long way, but surely we’re not at the point where we can 3D-print a roller coaster, right? It turns out that you can, as long as 1/25th scale is good enough for you.

Some people build model railroads, but [Matt Schmotzer] has always had a thing for roller coasters. Not content with RollerCoaster Tycoon, [Matt] decided to build an accurate and working model of Invertigo, a boomerang coaster at Kings Island, the coaster nirvana in Cincinnati, Ohio. Covering a sheet of plywood and standing about 3′ tall, [Matt]’s model recreates the original in painstaking detail, from the supporting towers and bracing to the track sections themselves. It appears that he printed everything in sections just like the original was manufactured, with sections bolted together. Even though all the parts were sanded and vapor smoothed, the tracks themselves were too rough to use, so those were replaced with plastic tubing. But everything else is printed, and everything works. An Arduino Mega controls the lift motors, opens and closes the safety bars on the cars, and operates the passenger gates and drop floor in the station. The video below shows it in action.

Fancy a coaster of your own, but want something a little bigger? We understand completely.

Filed under: 3d Printer hacks

Windows 10 All-In-One For Dummies, 2nd Edition ($19 Value) FREE For a Limited Time

The content below is taken from the original ( Windows 10 All-In-One For Dummies, 2nd Edition ($19 Value) FREE For a Limited Time), to continue reading please visit the site. Remember to respect the Author & Copyright.


The 2nd edition of Windows 10 All-In-One for Dummies is available for free for a limited period of time to TWC readers. The eBook is a detailed guide to help you learn more about the latest Windows update and make the most […]

This post Windows 10 All-In-One For Dummies, 2nd Edition ($19 Value) FREE For a Limited Time is from TheWindowsClub.com.

Managing File Associations in Windows 10

The content below is taken from the original ( Managing File Associations in Windows 10), to continue reading please visit the site. Remember to respect the Author & Copyright.

In this Ask the Admin, I’ll explain the changes to how default app file associations are managed in Windows 10.

 

 

Windows 10 helps to prevent file associations from being hijacked. In previous versions of Windows, when a new application was installed or run for the first time, it often checked whether it was the default app for a given file type. For example, a browser might check to see if it is the default program for opening HTML files. If the developer followed best practices, the user would be asked to provide consent before the app was set as default. Some developers skipped this step and changed the file association settings in the registry without the user’s consent.

 

The UI for managing file associations in Windows 10 aims to put the user in control. Neither Store apps nor win32 apps can invoke a prompt asking for consent to change app defaults. Instead, Windows 10 displays a notification when a user launches a file type that has multiple programs registered as handlers. When a new application registers an extension, unless the user previously checked the ‘Always use this app to open .doc files’ box, the notification will pop up for the given file type. If a win32 app tries to invoke a consent prompt for changing app defaults in Windows 10, you will receive a warning that you need to change the default settings in the Settings app under Apps > Default apps.


Changing Default Apps in Windows 10 Version 1709 (Image Credit: Russell Smith)

 

To protect against win32 apps that bypass the user-consent prompt when changing default app settings, Windows 10 monitors the registry and resets the default app for the changed file association. So, if you use scripts to modify the default-app file associations in the registry in Windows 10, users will receive app reset notifications.

Programmatically Configure File Associations in Windows 10

Microsoft recommends using the dism command-line tool to export default file associations on a reference computer. There are three steps:

  1. Install your apps on a reference computer running Windows 10.
  2. Log in as a local administrator, open the Settings app, and set up your default apps under Apps > Default apps.
  3. Export the file associations using dism. Make sure that you are using the same local administrator account that was used to set the default apps.
dism.exe /online /export-defaultappassociations:c:\temp\customfileassoc.xml


Exporting Default App Settings Using dism in Windows 10 (Image Credit: Russell Smith)

 

The file will contain default-app settings for all file associations configured on the device. It’s important not to remove these from the file; otherwise, users will receive reset notifications when they log on. When a new version of Windows is released, you should repeat the process above because the built-in applications might register new file associations.


Default App Settings Exported as an XML File in Windows 10 (Image Credit: Russell Smith)
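
For reference, the exported file is plain XML with one Association element per file type. A rough sketch of its shape; the ProgIds and application names are examples and will vary with the apps you set as defaults:

<?xml version="1.0" encoding="UTF-8"?>
<DefaultAssociations>
  <Association Identifier=".htm" ProgId="ChromeHTML" ApplicationName="Google Chrome" />
  <Association Identifier=".pdf" ProgId="AcroExch.Document.DC" ApplicationName="Adobe Acrobat Reader DC" />
</DefaultAssociations>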

 

The .xml file created above can be used to modify your Windows 10 reference image:

dism.exe /online /import-defaultappassociations:c:\temp\customfileassoc.xml

 

You can also use the same file to deploy the default apps settings using Group Policy. The required setting is Set a default associations configuration file under Computer Configuration > Administrative Templates > Windows Components > File Explorer.


Deploy Default Apps Settings Using Group Policy in Windows 10 (Image Credit: Russell Smith)

 

If you use dism to change a Windows 10 image, file associations are applied only to new user profiles, and users can change them. If you use Group Policy, the file association settings are applied each time a user logs on. You can combine the two methods: enforce file associations for some file types by using Group Policy, while allowing users to change others that were imported with dism.

 

In this Ask the Admin, I showed you how to manage file associations (default apps) in Windows 10. For more detailed instructions, see Microsoft’s website here.

The post Managing File Associations in Windows 10 appeared first on Petri.

Azure Application Architecture Guide

The content below is taken from the original ( Azure Application Architecture Guide), to continue reading please visit the site. Remember to respect the Author & Copyright.

We’ve talked to many customers since Azure was released nearly eight years ago. Back then, Azure had only a few services. Now it’s grown tremendously and keeps expanding. Cloud computing itself also has evolved to embrace customer demands. For example, most consumer-facing apps require a much faster velocity of updates than before, to differentiate them from competitors. That’s part of the reason why new architecture styles such as microservices are gaining traction today. Container-based and serverless workloads are becoming de facto standards. We see all of these new services and industry trends as a great opportunity, but at the same time, they can be a source of confusion for customers. Customers have a lot of questions, such as:

  • Which architecture should I choose? Microservices? N-Tier? How do we decide?
  • There are many storage choices, which one is the best for me?
  • When should I use a serverless architecture? What’s the benefit? Are there any limitations?
  • How can I improve scalability as well as resiliency?
  • What’s DevOps culture? How can I introduce it to my organization?

To help answer these questions, the AzureCAT patterns & practices team published the Azure Application Architecture Guide. This guide is intended to provide a starting point for architects and application developers who are designing applications for the cloud. It guides the reader to choose an architectural style, then select appropriate technologies and apply relevant design patterns and proven practices. It also ties together much of the existing content on the site. The following diagram shows the steps in the guide along with the related topics.

Architecture guide steps

 

Architecture styles. The first decision point is the most fundamental. What kind of architecture are you building? It might be a microservices architecture, a more traditional N-tier application, or a big data solution. We have identified seven distinct architecture styles. There are benefits and challenges to each.

Technology Choices. Two technology choices should be decided early on because they affect the entire architecture. These are the choice of compute and storage technologies. The term compute refers to the hosting model for the computing resources that your applications run on. Storage includes databases but also storage for message queues, caches, IoT data, unstructured log data, and anything else that an application might persist to storage.

Design Principles. Throughout the design process, keep these ten high-level design principles in mind.

Pillars. A successful cloud application will focus on these five pillars of software quality: Scalability, availability, resiliency, management, and security.

Cloud Design Patterns. These design patterns are useful for building reliable, scalable, and secure applications on Azure. Each pattern describes a problem, a pattern that addresses the problem, and an example based on Azure.

This guide is also available for download as an ebook.

We hope you will find the Azure Application Architecture Guide useful. Lastly, we value your feedback and suggestions. If you see anything that is missing in the content, suggestions for improvements, or want to share information that has worked well for your customers and could be elevated to a broader audience, please contact us at [email protected].

Microsoft Brings Advanced Threat Protection to SharePoint Online

The content below is taken from the original ( Microsoft Brings Advanced Threat Protection to SharePoint Online), to continue reading please visit the site. Remember to respect the Author & Copyright.

More Anti-Malware Protection for SharePoint Online

Microsoft’s December 5 announcement that Advanced Threat Protection (ATP) is now available for “SharePoint, OneDrive for Business, and Microsoft Teams” generated a lot of excitement. And so it should, because increasing the level of protection against malware infections is always a good thing. And given that more people than ever before are storing files in the cloud, making sure that the bad stuff is detected and stopped in place is sensible.

The Cost of Protection

My excitement faded a little when I read the blog and found that ATP for SharePoint (for that is what this is) requires tenants to buy either Office 365 E5 licenses or separate add-on licenses. Microsoft is steadily increasing the number of features that you can use only if you have E5, to grow the average revenue per seat. When you have 120 million active users, Office 365 can drive a lot of extra dollars.

How ATP for SharePoint Works

ATP for SharePoint works differently to ATP for email, which is also part of Office 365 E5. The Exchange Online mail transport service is a single chokepoint where email moving into and out of a tenant must pass through. ATP for email examines inbound messages to detect and remove malware, using techniques like Dynamic Delivery to process potentially suspicious attachments. Overall, ATP works well for email.

But SharePoint does not have a single chokepoint where content can be checked. This is the same problem that the Data Loss Prevention (DLP) team ran into when they wanted to check documents for sensitive data types like credit card numbers or social security numbers. DLP does not try to check every file to detect sensitive data and concentrates instead on preventing users sharing sensitive data when they should not. Likewise, ATP “does not scan every single file.” Instead, ATP scans files asynchronously as users share files to detect possible problems.

Of course, this is a simple description of what is very complex processing. ATP uses other inputs to decide when sharing documents might cause issues, including “smart heuristics” and intelligence about potential threats. The implementation is reasonable because although you might want to scan every file stored in your SharePoint Online and OneDrive for Business sites, such an operation would consume huge resources.

Stopping Malware Spreading

Scanning every file in Office 365 might catch some problematic files, but if those files are in place and doing nothing they cannot do any harm. Problems only happen when people share files with others and spread infection to places where someone might open and activate malicious content.

When ATP for SharePoint finds malware in a file, it locks the file and stops users from opening, downloading or sharing it. The file remains in the library and the user can delete it. Administrators can keep an eye on what’s happening with the Threat protection status report in the Security and Compliance Center, where they can see a report of all files detected and blocked by ATP for SharePoint.

In addition, ATP logs a new “detected malware in file” audit event when it detects malware. Like other audit events, you can search the Office 365 audit log to find problems, or create an activity alert or activity policy to email you automatically when ATP logs this event (Figure 1).
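
You can also pull these events with PowerShell. Here is a minimal sketch, assuming the event is recorded under the operation name FileMalwareDetected (an assumption on my part) and that you run it from an Exchange Online PowerShell session with audit log search permissions:

# Sketch: find ATP malware detections over the last week.
# Assumes the operation name "FileMalwareDetected" for the new audit event.
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
    -Operations FileMalwareDetected -ResultSize 100 |
    Select-Object CreationDate, UserIds, AuditData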

Figure 1: Creating an activity alert for malware detection (image credit: Tony Redmond)

Turning on ATP for SharePoint

Once you have the necessary licenses, enabling ATP for SharePoint is easy. Go to the Security and Compliance Center, open Threat Protection, then Policy, select Safe attachments, and set the checkbox to turn on ATP (Figure 2).

Figure 2: Enabling ATP for SharePoint Online (image credit: Tony Redmond)
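
If you prefer PowerShell, the same switch can be flipped from an Exchange Online PowerShell session, assuming your tenant is licensed for ATP and exposes the ATP cmdlets. A minimal sketch:

# Sketch: enable ATP for SharePoint, OneDrive for Business, and Teams.
# Assumes an Exchange Online PowerShell session in an ATP-licensed tenant.
Set-AtpPolicyForO365 -EnableATPForSPOTeamsODB $true

# Verify the setting took effect.
Get-AtpPolicyForO365 | Select-Object EnableATPForSPOTeamsODB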

As noted in Microsoft’s instructions, the new policy must be published to all of the datacenters that hold your tenant data. That process can take up to 30 minutes, as an Office 365 datacenter region spans multiple datacenters (EMEA, for instance, has four: Helsinki, Dublin, Amsterdam, and Vienna).

Don’t expect a flood of notifications after you enable ATP for SharePoint. Remember that checking centers on file sharing, so you should only expect to see events when people try to share infected content.

ATP for Teams?

Teams is Microsoft’s current poster-child application for Office 365, so it is no surprise when marketing tries to include Teams in every announcement. In this case, every team has a SharePoint team site and document library, so every team can store documents, and therefore every team can take advantage of ATP for SharePoint.

It is a tenuous but valid connection. People using Teams to store files might not even know that they are using SharePoint, but they are. And because they store attachments and other files in SharePoint, they gain protection when ATP is active. If ATP blocks a suspect file, users cannot download or open that file with Teams.

Protection for a World of Threat

Commenting on ATP, Dana Simberkoff, Chief Compliance, Risk and Information Security Officer at AvePoint, said: “It’s important for organizations to implement a layered approach to security. Improperly relying on ATP alone, or any other existing scanning technology, may create a false sense of safety in that most costly breaches come from simple failures, and not from attacker ingenuity.”

I agree. ATP is not a panacea that will suddenly eradicate malware from SharePoint.

The thing about security is that threats evolve all the time. You can’t stay still and expect to remain invulnerable when new threats come along. One of the nice things about using Office 365 is that you gain from the investments Microsoft makes in technologies like ATP. That is, if you can afford the licenses.

Follow Tony on Twitter @12Knocksinna.

Want to know more about how to manage Office 365? Find what you need to know in “Office 365 for IT Pros”, the most comprehensive eBook covering all aspects of Office 365. Available in PDF and EPUB formats (suitable for iBooks) or for Amazon Kindle.

 

The post Microsoft Brings Advanced Threat Protection to SharePoint Online appeared first on Petri.

Restore Open command window here item to Windows 10 folder context menu

The content below is taken from the original ( Restore Open command window here item to Windows 10 folder context menu), to continue reading please visit the site. Remember to respect the Author & Copyright.

Not long back, the option to use Command Prompt was available at various spots in the Windows OS environment. You could open the Command Prompt in any folder by holding down the Shift key and then right-clicking to see the Open […]
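
The usual fix is a registry edit. A hypothetical sketch in PowerShell, assuming the widely reported HideBasedOnVelocityId value under HKCR\Directory\shell\cmd; the key is owned by TrustedInstaller by default, so you must take ownership of it before the rename will succeed:

# Sketch: restore "Open command window here" on the extended
# (Shift + right-click) folder context menu. Assumes the commonly
# cited HideBasedOnVelocityId value; run elevated, after taking
# ownership of the key (TrustedInstaller owns it by default).
$key = "Registry::HKEY_CLASSES_ROOT\Directory\shell\cmd"
Rename-ItemProperty -Path $key -Name "HideBasedOnVelocityId" -NewName "ShowBasedOnVelocityId"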

This post Restore Open command window here item to Windows 10 folder context menu is from TheWindowsClub.com.

Deployment strategies defined

The content below is taken from the original ( Deployment strategies defined), to continue reading please visit the site. Remember to respect the Author & Copyright.

This post is authored by Itay Shakury, Cloud Solution Architect.

In the world of devops practices and cloud-native infrastructure, the concept of deployment has evolved from an uninteresting implementation detail into a fundamental element of modern systems. There seems to be a general understanding of its importance, and work is being done to build solutions and tools for better deployment practices, but we have never paused to agree on which deployment strategies matter or how to define them. It’s not uncommon to see people use different terms for the same meaning, or the same term for different meanings. This leads to people reinventing the wheel as they try to solve their own problems. We need a common understanding of this topic in order to build better tools, make better decisions, and communicate with each other more clearly.

This post is my attempt to list and define those common deployment strategies, which I call:

  • Reckless Deployment
  • Rolling Upgrade
  • Blue/Green Deployment
  • Canary Deployment
  • Versioned Deployment

There are probably other names and terms you expected to see on this list. I’d argue that those missing terms can be seen as variants of these primary strategies.
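
To make one of these concrete before the tutorial arrives: on Azure App Service, a blue/green cutover maps naturally onto deployment slots. A minimal sketch, using hypothetical resource names and the AzureRM module of the era:

# Sketch: blue/green cutover via an App Service slot swap.
# "staging" runs the new (green) build; the swap promotes it to production
# and keeps the previous (blue) build in the slot for instant rollback.
# Resource group and app names are hypothetical.
Switch-AzureRmWebAppSlot -ResourceGroupName "rg-demo" -Name "contoso-web" `
    -SourceSlotName "staging" -DestinationSlotName "production"

Rolling back is the same swap run in the opposite direction.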

Note: The post is about definitions, methodologies and approaches and not about how to implement them in any technology stack. A technical tutorial will follow in another post. Stay tuned!

Read the full essay here.

Cloud Standards Customer Council Announces Version 3.0 of Practical Guide to Cloud Computing

The content below is taken from the original ( Cloud Standards Customer Council Announces Version 3.0 of Practical Guide to Cloud Computing), to continue reading please visit the site. Remember to respect the Author & Copyright.

The Cloud Standards Customer Council (CSCC), an end user advocacy group dedicated to accelerating cloud’s successful adoption, today published… Read more at VMblog.com.

Microsoft launches Windows 10 on ARM, with HP and ASUS promising 20+ hours of battery life

The content below is taken from the original ( Microsoft launches Windows 10 on ARM, with HP and ASUS promising 20+ hours of battery life), to continue reading please visit the site. Remember to respect the Author & Copyright.

Windows laptops and tablets have traditionally run on x86 processors from the likes of Intel and AMD. Microsoft experimented with ARM-based processors when it launched the Surface RT and Windows RT in 2012, and the experiment cost the company dearly. Fast-forward to today, and Microsoft is ready to give ARM on laptops another try.