Build a Raspberry Pi-Powered, Alexa-Infused Alarm Clock
The content below is taken from the original (Build a Raspberry Pi-Powered, Alexa-Infused Alarm Clock), to continue reading please visit the site. Remember to respect the Author & Copyright.
Your phone is probably the smartest alarm clock you’ve ever owned, but if you’re looking for a project that’s a little more playful, Nick Triantafillou shares a smart alarm clock on Hackster.io that integrates Alexa, If This Then That, and more.
Finding compatible USB-C accessories is a crapshoot
The content below is taken from the original (Finding compatible USB-C accessories is a crapshoot), to continue reading please visit the site. Remember to respect the Author & Copyright.
By Andrew E. Freedman
USB Type-C is mainstream now. HP and Apple put the new standard on their high-end laptops exclusively, while a number of others are using both USB 3.0 and USB-C. Samsung added USB Type-C to the Samsung Galaxy Note 7, which ensures that we’ll see it on a ton of phones going forward.
In theory, this should be awesome. You can use USB Type-C for charging, transferring data, putting video on external monitors, listening to music and more. And, most famously, it’s reversible.
But using USB Type-C with third-party accessories hasn’t proven to be the seamless experience it should be. I’ve been trying out a number of USB Type-C docks, chargers and other peripherals, only to find that, when it comes to charging and Alternate Mode for video, they work on a case-by-case basis (we haven’t had the same issues with data transfer).
I can’t recommend that you go out and buy USB-C chargers and alt-mode docks immediately, unless you know for sure that what you’re buying will work with your specific laptop or tablet.
Take, for example, the Innergie PowerGear USB-C 45 universal charger. We really like this device, because it’s the first third-party power brick that promises to work with any USB Type-C laptop, tablet or phone. Unfortunately, thanks to some picky OEMs, it’s not so universal. For instance, HP’s notebooks block third-party chargers when they’re turned on — meaning you can’t guarantee that it will charge your notebook.
Then there are the alt-mode docks. Oh God, the docks.
Docks that use DisplayLink technology, which requires a driver and treats your video like regular USB data, don’t have trouble connecting to a monitor. However, docks that use USB Type-C’s Alternate Mode, or "alt-mode," which sends the video as DisplayPort or HDMI signals, are extremely inconsistent. The Dell DS1000, Kensington SD4600P, Plugable UD-CA1 and HP Elite Thunderbolt 3 simply didn’t work with everything we tested. HP limited its dock to work with its computers, while the DS1000, UD-CA1 and SD4600P were complete crapshoots. Depending on the laptop you plug into these docks, they may or may not charge, output video to monitors or display the resolutions promised.
Who is to blame? The dock vendor or the laptop vendor? Does the USB Implementers Forum have any oversight here?
"Alternate Mode specifications are developed by the respective standards organization or vendor and not the USB-IF," a spokeswoman for the group said in a statement to Laptop Mag. "For example, VESA has a DisplayPort Alternate Mode specification in support of USB Type-C, [the] MHL Consortium developed MHL Alt Mode for USB Type-C, and there are proprietary specifications like Intel’s Thunderbolt, which also has an Alt Mode spec in support of USB Type-C."
MORE: How to Buy USB Type-C Cables That Won’t Fry Your Gadgets
The USB-IF declined to comment on any specific products, but suggested that those who had issues should contact the vendors or industry groups that are implementing Type-C and alt-mode. This is why DisplayLink-powered docks are still a much better option, even though they require you to install a small software package.
In other words, we need vendors to consistently implement USB Type-C, alt-mode and everything else that comes with it, if we are to have any hope that our chargers, docks, hubs and adapters will work across devices.
But right now, that isn’t the case. While Dell’s laptops, like the XPS 13, have worked with almost every accessory that we’ve tested so far, others, like Lenovo’s, have worked with some, but not all.
For instance, the Lenovo ThinkPad 13 was my go-to for the SD4600P, but Dell’s DS1000 didn’t want to play nice with it. In the latter’s case, it wouldn’t work with two monitors simultaneously as promised, and some ports, like VGA and HDMI, didn’t work while using DisplayPort. I had to switch to a Dell XPS 13 to get the results I wanted.
HP appears to be one of the biggest roadblocks, often blocking third-party USB-C accessories, likely due to fear of what a poorly made or counterfeit charger could do to a device.
"Are we being too conservative?" HP vice president Mike Nash asked PCWorld late last year. "I don’t think so."
Just this week, the USB-IF announced the rollout of a USB Type-C charger certification program, where accessory makers will submit their power bricks for testing and be able to put a logo on them, showing that they are "safe." However, we don’t know whether this program will have any effect on laptop manufacturers and their desire to block third-party chargers.
You never have to wonder whether the generic micro USB phone charger you picked up at the airport store will power your LG or Samsung phone. And you shouldn’t have to worry about whether the USB Type-C charger you got at the corner store will juice your HP, Dell, Lenovo or Asus laptop.
Until we get to a point where all vendors use the same spec, users are out in the cold. Should they buy the dock they need? Will it work or will it serve as an expensive paperweight instead of delivering power? Is it worth your money to get an accessory that may or may not work? Be sure to check the vendor’s compatibility list, if there is one.
USB Type-C is great. It’s the future. I want vendors to pick it up even more rapidly than they have. But for the new standard to meet its potential, everything you want to plug into it has to work, no matter what company’s computer or phone you’re buying. If you’re buying something with a driver to deliver data, you’ll probably be fine, but the promise is that everything — everything — will work out of the box. That’s not the case yet. Hopefully, it will be one day.
Until then, purchase with extreme caution. Or hold off.
Six futuristic data storage technologies
The content below is taken from the original (Six futuristic data storage technologies), to continue reading please visit the site. Remember to respect the Author & Copyright.
By Cat DiStasio
Digital technology is taking over the world, and scientists are hard at work finding better ways to store data — lots of it and for long periods of time. Scientists are exploring new materials for data storage as well as new methods for printing data on their chosen medium. While some companies are storing data on the ocean floor, other imagineers look upward, dreaming of giant storage skyscrapers. With so many different innovations happening in such a short period of time, the race is on to unlock the keys to near-limitless data storage potential.
5D Glass Data Disc
Data storage in five dimensions, embedded in nanostructures within glass discs, could inspire the next wave in record-keeping. A research team at the University of Southampton’s Optoelectronics Research Center (ORC) created a prototype the size of a quarter that can hold 360 terabytes of data and withstand extreme heat up to 190°C (374°F). The team believes their invention could be used to store data for up to 13.8 billion years (the age of the universe, FTW) because, unlike CDs and DVDs which hold their data on the surface and are prone to scratches, the 5D glass discs protect that information within their structure, safe from bumps and scrapes.
Underwater data centers
It’s no surprise that Microsoft has tons of data to keep secure, and last year the tech giant started experimenting with putting the "cloud" deep under water. Project Natick enclosed data servers in a huge watertight capsule and sank it beneath the waves in the Pacific Ocean off the coast of Washington. After a two-month test period, the 38,000-pound steel container was brought back to the surface, where its contents — a data center with the computing power of 300 desktop PCs — were nice and dry. Learning that the concept works may lead Microsoft to someday install more underwater data centers, but for now it has no firm plans.
Iceland’s proposed data skyscraper
Still in the concept phase, this epic data-storing skyscraper was designed to be located in Iceland. The building would act as a giant cylindrical motherboard with a hollow center, allowing for plenty of natural air flow to keep the data servers cool — which shouldn’t be too big of a problem in a chilly place like Iceland. The country’s renewable energy infrastructure also means the tower could be powered by 100 percent clean energy. The design concept won third place in the 2016 eVolo Skyscraper competition, although there’s no telling if it will ever become a reality.
Hitachi’s Quartz Glass Disc
Electronics leader Hitachi produced another version of 5D glass data storage in 2012. Using binary code, researchers packed 40 megabytes of data into a one-square-inch piece of quartz glass (the type beakers are made from). At barely two millimeters thick, each square can hold the same amount of data as a CD and endure temps up to 1,832°F as well as run-ins with chemicals and water. The data can be retrieved with an optical microscope and, just like the glass panes we are all accustomed to, is perfectly transparent no matter how much data is etched into it.
Floating cantilever for low-energy devices
An international research team based in South Korea and Scotland developed a proof of concept for a new type of data storage that relies on a floating cantilever for small gadgets like cell phones and MP3 players. The self-propelled cantilever reacts to electrical currents within the device to convert this electrical information into binary code, and it’s both faster and more energy-efficient than existing technologies. Although the tech hasn’t made its way into consumer electronics yet, there’s still potential for the breakthrough to lead to more efficient data storage down the road.
Abandoned mines as data centers
While many researchers are hard at work developing new data storage devices, others are looking for better locations to put servers. Some say the answer is right beneath our feet, so to speak. Abandoned limestone mines across the country could be retrofitted into the perfect locations for underground data centers. An efficient data storage center will have a consistent cool temperature and humidity level — two requirements that lead to massive energy use above the ground. Deep inside a mine, however, the conditions are just right. At least one architectural firm, Callison, has already converted a former mine site to an underground data center somewhere in the Northeast United States, but the exact location is top secret. Talk about secure data storage.
Download the Windows 10 Bible Free
The content below is taken from the original (Download the Windows 10 Bible Free), to continue reading please visit the site. Remember to respect the Author & Copyright.
Windows 10 Bible from Wiley is being offered as a free download for TWC readers. The eBook covers customization, content management, networking, hardware, performance, security, etc. If you are new to Windows 10, I am sure that you will find it of interest.
Windows 10 Bible eBook
Windows 10 and its built-in apps get several new features, and it is quite possible that you may not be familiar with all of them. Whether it is Windows Ink, Voice Recorder, Edge, Narrator, Cortana or Windows Defender, all these new features add to making Windows 10 what it is.
Windows 10 Bible is a reference guide that promises to cover all aspects of the Windows 10 operating system, and covers the following areas:
- Personalizing Windows 10
- Fine-tuning performance, connecting to a network, working with the cloud, etc.
- Managing content, media, software, and security
- Eliminating issues related to printing, faxing, and scanning
- And more.
Whether you’re starting from scratch or just looking to become more proficient, this guide is your ideal solution. You’ll learn just what Windows can do, and how to take full advantage so you can get more done faster.
Go here to download the eBook, which otherwise costs $32.99, for free. You will be asked to enter your personal details and email ID, or log in with your social account.
Do share this post, if you think your friends too may be interested in this free offer.
If you are looking for more free eBooks, downloads or other freebies, go visit this link and see if anything interests you.
Radio Silence Stops Mac Apps From Phoning Home, Now Shows You Traffic In Real Time
The content below is taken from the original (Radio Silence Stops Mac Apps From Phoning Home, Now Shows You Traffic In Real Time), to continue reading please visit the site. Remember to respect the Author & Copyright.
Mac: We were big fans of Radio Silence when it initially launched several years ago because it was one of the easiest, cheapest ways to keep an eye on Mac apps secretly phoning home. A recent update makes Radio Silence a bit easier to use, and it now shows you traffic in real time.
Like the original version of Radio Silence, you can simply pick from a list of applications and set up a block for them so they can’t transmit data back to the mothership. This update adds in a new monitor mode that shows you not only which apps are currently sending data, but where it’s going. This can help you keep an eye on apps that you might not realize are even transmitting data. You can check out a trial of Radio Silence before you decide whether or not you’d like to shell out the $9 for the full version.
Radio Silence ($9)
Over 90% of Small- to Medium-Sized Businesses Currently Use Cloud Hosting or Say They Plan To
The content below is taken from the original (Over 90% of Small- to Medium-Sized Businesses Currently Use Cloud Hosting or Say They Plan To), to continue reading please visit the site. Remember to respect the Author & Copyright.
Cloud hosting is an overwhelmingly popular option among small- to medium-sized businesses (SMBs), as shown by a new survey from Clutch, one of the… Read more at VMblog.com.
HP’s new Omen gaming PCs include a cube-shaped desktop
The content below is taken from the original (HP’s new Omen gaming PCs include a cube-shaped desktop), to continue reading please visit the site. Remember to respect the Author & Copyright.
Gamers have many reasons to steer clear of desktops from big-name brands, but one of the biggest is limited expansion. You may have fewer upgrade slots (if any) versus a white-label or home-built rig, and you’ll frequently have to contend with non-standard parts. HP thinks it can make you reconsider, however. It’s refreshing its Omen gaming PCs once again, and the highlight is a completely new Omen X Desktop that promises both the perks of a major company’s industrial design and the expansion that you crave. That cube-on-its-side look is not only relatively unique in a sea of generic towers, but genuinely functional. Its three-chamber structure separates hot components while giving you room for expansion that includes dual graphics cards, four tool-free hard drive bays and an M.2 SSD. Also, this is an industry-standard chassis — HP will sell you the barebones case if you prefer to supply your own internals, and Maingear will even build its own beastly gaming PC around the box this year.
There’s one thing you won’t escape from major brand gaming PCs, though: the price. The Omen X Desktop will be available at HP’s website on August 17th for a starting price of $1,799, and that will get you an overclockable 4GHz Core i7, 8GB of RAM, Radeon RX 480 graphics, a 256GB SSD, a 2TB hard drive and a monstrous 1,300W power supply. That’s definitely not the most powerful system you could get for the money, and it’s going to get pricier if you want perks like a GeForce GTX 1080 or 16GB of RAM (the retail config due October 16th starts at $2,100). What you’re really paying for is that exotic shell. By itself, the case costs $600 — potentially worth it if you want the easy-access drives or a conversation piece, but overkill for most anyone else.
And don’t worry if you weren’t in the market for an over-the-top desk machine, as there’s more Omen hardware in the pipeline. An updated Omen 17 laptop now packs NVIDIA’s portable version of the GTX 1060 or GTX 1070 as well as a mini DisplayPort jack, making it friendly to both VR and dual external screens. It starts at $1,600. There’s also an Omen X Curved Display with support for NVIDIA’s extra-smooth G-Sync tech (due in early 2017 for an unknown price) and a range of SteelSeries accessories that include a customizable mouse ($60), a light-up keyboard ($100) and a headset ($80). All of the SteelSeries extras should arrive in mid-September.
Cherlynn Low contributed to this report.
Source: HP
The second cloud wave is upon us, ready or not
The content below is taken from the original (The second cloud wave is upon us, ready or not), to continue reading please visit the site. Remember to respect the Author & Copyright.
I often get asked about the next generation of cloud technology — and most people are disappointed with my answer. The next wave of cloud computing tech is not about a cloud at all; it’s about cloud governance and management.
We’ve spent the last 10 years building clouds and moving applications to them. In the United States, about 5 percent of IT workloads now run in the cloud, but that figure will double to 10 percent quickly — by 2018.
That means we’re approaching a tipping point for moving to cloud, where traditional management and governance approaches won’t work with the number of applications and data stores in the cloud. I think that tipping point is at about 10 percent.
Moreover, most enterprises are deploying more than one type and brand of cloud (a multicloud approach), which adds further complexity — so governance and management become even more important.
The governance and management technologies come in different forms, including API management, service governance, cloud management platforms, and monitoring. Some come from established providers, some from new providers. Some are part of mainstream clouds like those from Amazon Web Services and Microsoft, some are add-on technologies.
As the need for cloud governance and management becomes more widely understood by both IT and providers, we’ll see major cloud providers and system integrators start acquiring the smaller, more promising cloud management and governance providers. CSC’s acquisition of ServiceMesh a year ago is such an example.
We’ll also see new technologies emerge that are built specifically for the cloud. Enterprises should pay attention to the companies innovating in cloud governance and management — they could well be the secret sauce that lets your cloud workloads scale as intended when so much is running in the cloud. In fact, I believe we’ll need new technologies to support a world where more than 10 percent of enterprise workloads have been migrated to the cloud.
Are you prepared for this second wave of the cloud? Most enterprises are not, partly because the technology landscape is in flux and partly because they’re still early in the cloud deployment game. But that second wave will be here very soon, whether you’re ready for it or not.
Leveraging OneNote at Your Company
The content below is taken from the original (Leveraging OneNote at Your Company), to continue reading please visit the site. Remember to respect the Author & Copyright.
Creating good information management systems can be difficult. Document creation has largely been solved with Word, Excel, and PowerPoint. Business communication has been solved with Outlook and Skype for Business. Yet, many businesses still do not have a solution for managing information that does not fit anywhere else. Some examples of this kind of information are issue tracking, light project management, ticket handling, product summaries, shared team progress and action items.
Some large companies understand the importance of keeping on top of issues, tickets, meeting notes, action items, and project management so they have purchased specialized software to manage each task. However, many businesses leave those jobs up to their employees to find their own solutions. This usually means everyone finds their own system that works for them and everyone has a different system. While this works fine when everyone works alone, it can create issues when working in teams.
Trusting everyone to manage their own projects and information can empower those who have the proper skills and burden those who do not. Time management should not be an afterthought and avoiding the problem does not make it go away. Microsoft does sell a few products that help with project management, like Microsoft Project. But for many companies those tools are too expensive and cumbersome.
OneNote to the rescue
OneNote is a free note-taking program that Microsoft offers to everyone, and it can be the perfect information management tool for your business. OneNote is extremely flexible and can be easily set up to give structure to complex systems. All companies have some sort of system to manage projects and their status. Using several Word and Excel documents to manage issues and store progress updates can work, but it is not a good solution. OneNote provides an infinite blank canvas to track progress and include all types of media, from images to tables, documents, inking, audio, video, and more.
Instead of hiding project status within Excel documents stored in a complex file structure on a server, teams can expose information in an easy-to-understand format with OneNote. With standard formatting techniques, OneNote can give hard-to-find documents or data more visibility in fewer clicks. With computers taking over so much of our day-to-day work, we generate a huge amount of data. OneNote can help connect that data without moving or duplicating it.
Does your company store testing documents in one folder and test requests on a separate network drive, or something like that? Well, you can respect the current procedures and put a table in OneNote with links to the test data next to the test requests. This makes sense to your employees and coworkers who want the data, while also keeping your existing server file structure intact.
Work how you want to work
OneNote has several levels of organization that give you freedom in how to set up your notebooks. Some use cases call for pages to be moved, some call for sections to be created and closed. To-do items can be set using simple tags or by using Outlook Tasks. The Outlook Tasks can be assigned to team members and given due dates.
There is no single procedure for leveraging OneNote in your company, but it has huge potential. Paired with Outlook Groups, OneNote can take hard-to-reach information and give it to remote users. When your team is on board with using OneNote, it can become the first stop to answer critical questions like what open issues do we have, how did we solve this problem last time, or what is holding up progress.
Through an upcoming series of posts on leveraging OneNote in your company, I will give detailed examples on several different use cases. I will also highlight where OneNote is missing features and when that can be a pain. Topics of this series include project management, note tracking, issue tracking, ticket management, and summary pages.
The post Leveraging OneNote at Your Company appeared first on Petri.
500px’s Splash Lets You Search for Photos by Sketching Them
The content below is taken from the original (500px’s Splash Lets You Search for Photos by Sketching Them), to continue reading please visit the site. Remember to respect the Author & Copyright.
Photography enthusiast site 500px has a new tool that lets you search for photos by sketching them. Have a vague idea of what you want to see or a memorable picture that you haven’t been able to find? Just channel your inner Van Gogh.
Of course, drawing an image with your mouse on the virtual canvas won’t allow for much accuracy, but you can use a color palette you like to search for scenery. The photos are primarily landscape photography, and you can then buy them for your own use or even order prints. Color is really key here—you can try to meticulously draw a lighthouse, for example, but the tool extrapolates your broad strokes and color choices more than your intricate intentions. You can also search for animals, portraits, travel, and city photos that match up with your palette, and the search results update with every stroke you add to your drawing.
Tiny robot caterpillar can move objects ten times its size
The content below is taken from the original (Tiny robot caterpillar can move objects ten times its size), to continue reading please visit the site. Remember to respect the Author & Copyright.
Soft robots aren’t easy to make, since they require a completely different set of components from their rigid counterparts. It’s even tougher to scale down the parts they typically use for locomotion. A team of researchers from the Faculty of Physics at the University of Warsaw, however, successfully created a 15-millimeter soft micromachine that only needs light to be able to move. The microrobot is made of Liquid Crystalline Elastomers (LCEs), smart materials that change shape when exposed to visible light. Under a light source, the machine’s body contracts like a caterpillar and forms waves to propel it forward.
The researchers said the robo-caterpillar can climb steep slopes, squeeze into minuscule spaces and move objects ten times its size. A tiny machine like this that can operate in challenging environments could be used for scientific research, and maybe even espionage if someone can find a way to attach a camera or a mic to it. But if the robot’s a bit too small for a specific application, researchers could also adopt the team’s method to make something a wee bit bigger.
Via: PopSci
Source: Faculty of Physics at the University of Warsaw, Advanced Optical Materials
How to Handle A Data Breach
The content below is taken from the original (How to Handle A Data Breach), to continue reading please visit the site. Remember to respect the Author & Copyright.
To a modern business, a data breach can have devastating effects. We have seen TalkTalk hastily bungle, Sage coyly dawdle, and it has got to change. We don’t spend all day hunting these elusive beasts either, but we have had our involvement in both breaches and feel we could offer some public insight into this very elusive modern mishap. We urgently want change. Our motivations aren’t always… Read more →
Cisco joins Microsoft and flings out Skype-friendly collab app
The content below is taken from the original (Cisco joins Microsoft and flings out Skype-friendly collab app), to continue reading please visit the site. Remember to respect the Author & Copyright.
Cisco has partially put aside its rivalry with Microsoft and launched a new collaboration product that is compatible with Microsoft’s Skype for Business.
The Cisco Meeting Server allows customers to connect people in Cisco video rooms with others who are using Skype for Business. It was built using technology Cisco acquired from Acano, which it paid $700m (£532m) for earlier this year.
The company said the move represents “a huge leap forward”. In a statement it said: “We want to make connecting with others painless so you can get on with the task at hand and get great work done.”
It allows anyone to join a collaboration meeting regardless of whether they’re using competitors’ gear such as Avaya or Polycom, or Microsoft’s Skype for Business, said Cisco’s Snorre Kjesbu.
Microsoft itself has been following a more collaborative approach with its former adversaries under the leadership of Satya Nadella, one of whose first launches was Office for iPad.
Now it seems Cisco is to some extent following in Microsoft’s footsteps of being more open and collaborative with competitors.
Rowan Trollope, senior veep at Cisco, said: “Connecting should not be hard. But it has been, because certain vendors’ technologies have not played well with standards-based technologies, like Cisco’s industry-leading video systems.”
He continued: “We just fixed that, and the impact is huge. Just as you don’t think twice about whether an iPhone can call a Samsung Galaxy, enterprises need to know that everyone can join the meeting. And now they can.”
Andrew Heintz, manager of video engineering at US energy provider Exelon said: “Our users didn’t understand why they couldn’t connect Skype for Business and our Cisco video rooms. It didn’t make sense, and we needed to change that.
“Now they don’t even think about it—a meeting is a meeting is a meeting. If they are in the office, they usually join from a Cisco video room. If they are working from home, they join that same meeting and get the same experience—but they join from Skype for Business.”
Under Trollope, the collaboration technology group has refreshed the entire collaboration technology portfolio. He joined the biz in 2012, having previously led Symantec’s sales, marketing, and product development teams.
The news comes the same week as Cisco axed 5,500 staff – as the company pushes ahead with its plans to become primarily a software outfit. ®
A First Look At PowerShell on Linux
The content below is taken from the original (A First Look At PowerShell on Linux), to continue reading please visit the site. Remember to respect the Author & Copyright.
If you had any doubts that the Microsoft of today is vastly different from the Microsoft you grew up with, I think the news out of Redmond today should put those doubts to rest. Microsoft has finally put all the speculation to rest and announced that PowerShell is now an open source project released under the MIT license. This is a monumental step for a company long known for its proprietary ways, but for those of us in the PowerShell community it doesn’t come as a total surprise. PowerShell creator and Microsoft Technical Fellow Jeffrey Snover has never hidden his hope of taking PowerShell open source. As you might imagine, for a company like Microsoft that is not a quick process.
But today everything changes.
Even though I’m talking about PowerShell going open source, this is possible because Microsoft has invested significant resources in developing a Core edition of the .NET Framework. The .NET Core Framework is also an open source project. If you are interested in learning more, visit the DotNetFoundation.org site. I mention this because the open source version of PowerShell is based on .NET Core, which means (as it always has) that if there are limitations or dependencies in .NET, there will be limitations in PowerShell. But don’t take that as a criticism; it is merely an observation.
So what do we get? Microsoft is announcing that you can now run an open source version of PowerShell on these platforms:
- Windows 8.1/Windows Server 2012 R2
- Windows 10/Windows Server 2016
- Ubuntu 14.04
- Ubuntu 16.04
- CentOS 7
- OS X 10.11
Yes, this means you can now run PowerShell on Mac and Linux! Personally, I think this is a major step in the right direction towards the goal of using PowerShell to manage everything, even non-Windows platforms or even from a non-Windows platform. Of course, we’re not 100% there yet as the bits available today are clearly alpha level. But every journey has to begin somewhere.
PowerShell on Linux
In hopes of getting you even more excited about this news, allow me to share some shots of PowerShell running on an Ubuntu 16.04 desktop. The install process is pretty simple, and when you are finished you can open a terminal session and run ‘powershell’.
You probably never thought you’d see that in a legitimate screen shot. It will be very important to check the $PSVersionTable variable to confirm which version you are running.
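As a quick sanity check, something like this at the new prompt shows what build you are on (a minimal example; the exact values returned will depend on the alpha build you installed):

# Show the full version table for the current session
$PSVersionTable

# Or just grab the engine version
$PSVersionTable.PSVersion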
PowerShell on Linux, especially this alpha, will be a very different experience than what you may be used to on Windows. We only have a subset of modules and commands.
Many commands that don’t make sense on Linux like Get-Service and Get-Eventlog have been removed. But others should behave just as you expect, including help.
When you run PowerShell commands in Linux you are working with objects which means you can create the same type of pipelined expressions as you would in Windows.
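For example, a simple pipeline like this (an illustrative sketch, not something taken from Microsoft's announcement) behaves the same on Ubuntu as it does on Windows:

# Find the five busiest processes by CPU time and keep just a few properties
Get-Process |
    Sort-Object CPU -Descending |
    Select-Object -First 5 Name, Id, CPU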
One major difference, at least for now, is that in a Linux PowerShell session, the Linux-style aliases we have in Windows, like ls and ps, have been removed. This means that in a PowerShell session you can use either the native command or the PowerShell equivalent.
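In practice, both of these work at the same prompt (a small sketch, assuming the alpha behaves as described above with the old aliases gone):

# Native Linux binary, returns plain text
ls -l /tmp

# PowerShell cmdlet, returns objects you can keep filtering
Get-ChildItem /tmp | Where-Object { $_.Length -gt 1MB }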
What to do with Linux aliases is at least one area where the PowerShell team will be relying on the community for guidance.
Limitations
I have not tested the Mac implementation but at least from the Linux side don’t expect everything to work as it does in Windows today. There are a number of areas that are still being addressed and I expect some of them will require additional open source implementations.
The most significant limitation, in my opinion, is that there is no remoting or remote computer capabilities. Microsoft is planning on an update to address this issue and most likely it will include support for OpenSSH as a native remoting protocol. Related to this issue, we won’t have any background or scheduled job capabilities either. And finally there is no support for WMI or CIM. The latter I think will eventually appear through other open source initiatives but for now there are no WMI or CIM commands that you can run on Linux but definitely keep an eye out in this space as the Operations Management Suite (OMS) continues to evolve.
I know that most of the time we tend to skip looking at README files, but take a few minutes to read the release notes and especially the Known Issues document.
What About Windows?
But what about all your Windows servers that you manage today from a Windows 10 desktop? Microsoft’s intention is to release supported versions of PowerShell based on the open source bits, but, I’m assuming, with all the features you’ve come to expect. In other words, when Windows Server 2016 ships I expect it to offer the same full-on PowerShell experience you are familiar with.
Next Steps
First, if you haven’t already done so, be sure to read the official announcement from Jeffrey Snover. Then head to http://bit.ly/2brNqaw and download the Alpha releases that interest you. Then you have to decide where you want to contribute and how you want to take part in this journey. Clearly, what is being released today is but a small piece of the PowerShell ecosystem, and Microsoft is relying on the community to help drive it forward.
We’ll have more coverage in the weeks to come about this exciting next step for PowerShell. For those of you who have invested in learning PowerShell, I think it is about to pay off in a big way. And for the rest of you, it’s never too late to start learning. Let’s get going.
The post A First Look At PowerShell on Linux appeared first on Petri.
A new algorithm can hide messages in your favorite dance music
The content below is taken from the original (A new algorithm can hide messages in your favorite dance music), to continue reading please visit the site. Remember to respect the Author & Copyright.
It’s long been known that secret messages can be included in music through techniques such as backmasking, but now a Polish researcher has developed an entirely new approach. By subtly varying the tempo of a particular type of dance music, he’s managed to encode information in a way that’s completely inaudible to human listeners.
StegIbiza is an algorithm for hiding information in a type of dance music known as Ibiza, which originates on the island of the same name in the western Mediterranean Sea. Ibiza music is characterized by its trance-like beat, and that’s what Krzysztof Szczypiorski, a professor at Poland’s Warsaw University of Technology, made use of.
To create his approach, Szczypiorski began by developing a sort of Morse code by which the dots and dashes that would represent letters are converted instead into a faster or slower tempo for a particular beat.
To prove his concept, he used Apple’s Logic X Pro music production software to create covers of five popular songs: “Lily was here” by David A. Stewart and Candy Dulfer; “Miracle” by Queen; “Rhythm is a dancer” by Snap!; “So what” by Miles Davis; and “You were the heart’s beat” by Andrzej Zaucha.
The songs were arranged without vocals in techno, hip-hop, and trance styles using the instruments available in Apple’s software. From there, Szczypiorski embedded the message “steganography is a dancer!” in each song, placed randomly.
Szczypiorski varied the degree to which tempos were altered as part of his encoding technique to see when those changes became discernible to human ears. He then tested his approach in both a studio setting, where participants wore headphones, and an open-air setting, where a DJ was in control. Across both, he found that tweaks of less than 1 percent didn’t get noticed at all.
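To make the idea concrete, here is a rough sketch of the encoding step only. This is not Szczypiorski's code; the tiny Morse table, the 128 bpm base tempo and the plus/minus 0.5 percent offsets are illustrative values chosen to stay under the roughly 1 percent threshold his listeners failed to notice:

# Conceptual sketch: map a message to per-symbol tempo values, Morse-style
$morse = @{ 't' = '-'; 'e' = '.'; 's' = '...' }    # tiny demo alphabet only
$baseTempo = 128.0                                  # assumed base tempo in bpm

$tempos = foreach ($char in 'test'.ToCharArray()) {
    foreach ($symbol in $morse[[string]$char].ToCharArray()) {
        if ($symbol -eq '.') { $baseTempo * 1.005 } # dot  -> slightly faster
        else                 { $baseTempo * 0.995 } # dash -> slightly slower
    }
}
$tempos   # one subtly shifted tempo per encoded symbol, below the audible threshold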
That means that StegIbiza could be a viable means of encoding information in music, and Szczypiorski suggests that software could be created both to code and decode music this way, with potential applications in security.
His paper is now available online.
Airbus reveals ambitious plan for autonomous flying taxis
The content below is taken from the original (Airbus reveals ambitious plan for autonomous flying taxis), to continue reading please visit the site. Remember to respect the Author & Copyright.
If a self-flying taxi scheme didn’t come from the world’s second largest aeronautical company, we might think it was a prank. However, Airbus appears to be serious about its "Vahana" project, aimed at creating an autonomous passenger drone network, and thinks testing can begin as early as 2017. That sounds ambitious, to say the least, but "many of the technologies needed, such as batteries, motors and avionics are most of the way there," according to Airbus engineer Rodin Lyasoff.
Users arriving at, say, an airport would book a seat on a so-called zenHop "CityAirbus" drone, then proceed to a "zenHub" helipad, according to the concept. They’d be flown to their destination for about the same cost as a taxi, since the ride would be shared by several passengers. Luggage would be delivered by another service (zenLuggage, of course), and the whole thing would be safeguarded from hackers by (wait for it) zenCyber.
The company said that the CityAirbus multi-rotor, electric aircraft design has been "kept under wraps," though it did supply an artist’s impression (above). The Airbus Helicopter subsidiary has been working on the drone-like design for two years, and it "could soon become reality without having to wait for too many regulatory changes," according to the press release.
Airbus is also working on a drone delivery service (below) and plans to start testing it at a Singapore university by mid-2017. The cargo-laden vehicles fly automated routes in "aerial corridors," then drop off their packages and send delivery notifications to customers. The goal is to "potentially increase acceptance for passenger flight testing, thus giving a boost to urban air vehicle projects," according to the company.
The idea of an electric passenger drone isn’t new, as we’ve seen a prototype from Chinese firm EHang and a manned flight test from Volocopter. However, Airbus, with 55,000 workers and thousands of engineers, has a far more realistic shot at making it feasible. "Our group’s strength is that we have interconnected projects that together are helping to drive the upcoming revolution," says developer Jörg Müller.
The company has already done a study and concluded that the idea has merit. It would first launch the passenger service with pilots, then proceed to autonomous aircraft once regulations and certain technologies, like see-and-avoid, fall into place. While it sounds like our dreams of a Jetsons-like urban utopia are finally falling into place, we’ll cling tightly to our skepticism until we see these taxis actually fly.
Source: Airbus
UK’s first cloud DVR lets you watch recordings anywhere
The content below is taken from the original (UK’s first cloud DVR lets you watch recordings anywhere), to continue reading please visit the site. Remember to respect the Author & Copyright.
To be blunt, Bush isn’t a brand known for particularly innovative products. But, come the end of the month, it’s the name you’ll see on the first cloud DVR to launch in the UK. Bush’s Digital TV Recorder is an affordable set-top box — arriving exclusively at Argos on August 30th for £100 — that lets you watch and record Freeview channels. What’s special about it, though, is the integration of ShowDrive, a service that takes those recordings and uploads them to the cloud so you can watch them wherever you want, and on basically any device.
The set-top box itself has two tuners (so you can watch one channel and record another simultaneously) and 16GB of internal storage. Add in an active ShowDrive subscription, however, and those 16 gigs become more of a local cache, with recordings subsequently uploaded to the cloud as fast as your internet connection can manage. Pay £2 per month, and you’ll get enough space for 35 hours of content, while £6 per month increases that to 350 hours. Discounts are also available if you spring for a year’s subscription upfront.
Once recordings have been uploaded to the ether, you can access them anywhere through the ShowDrive web app, which is compatible with computers and mobile devices. More than just a folder, the app presents your recordings in a visually rich UI with cover art, episode descriptions and the like. It’s searchable, offers recommendations, supports pause and resume across devices and will also point you towards on-demand platforms that feature more episodes of your favourite shows. If you’re on a shoddy hotel WiFi or slow 3G connection, then the quality of the stream will automatically scale to avoid any buffering downtime. Incidentally, the Bush box itself is said to upscale SD content to "near HD" when you’re watching TV at home.
We’re told other ShowDrive-ready set-top boxes, as well as TVs, will start popping up next year. For a while, though, Bush’s Digital TV Recorder will be the only DVR device of its kind available in the UK. That’s quite the claim considering the numerous pay-TV players we have over here, all with their own hardware. Many of them offer apps that let you stream TV and catch-up services on the move. Some also let you watch recordings on mobile devices over your home WiFi network, or download them to watch when you’re not. None of them, however, has yet used cloud storage to let you access recordings from anywhere.
Source: ShowDrive
Intel’s Optane XPoint DIMMs pushed back – source
The content below is taken from the original (Intel’s Optane XPoint DIMMs pushed back – source), to continue reading please visit the site. Remember to respect the Author & Copyright.
In the week of Intel’s Developer Forum we have heard its Optane XPoint SSDs and DIMMs may be delayed.
We are hearing that the first version of XPoint chips can only be used for SSDs. A second, more developed version is needed to make XPoint for DIMMs. That means XPoint DIMMs may not appear until 2018, which would give competing non-volatile DIMMs more time within which to establish themselves. If true, this means Diablo and Netlist get more breathing space.
Intel’s Optane SSDs employ an Intel ASIC as part of their control function. We are informed that this ASIC’s design, which was started some time ago, made assumptions about the characteristics of the media which have not been borne out as the XPoint chips have been delivered from the IMFT Lehi plant.
Our source said this controller area problem has contributed to the XPoint SSD’s relatively poor performance advantage over NAND SSDs, a mere 10X performance boost instead of the potential 1,000 times better than NAND. This may delay Intel’s Optane XPoint SSD production availability until 2017, the source claimed.
An Intel spokesperson said: “3D XPoint technology is in production and Intel has early samples with customers. Intel’s CEO recently stated during the Q2 earnings call, Optane SSDs will start to ship at the end of this year with 3D XPoint DIMMs following next year. We have not shared details on 2nd generation 3D XPoint technology.” ®
Announcing Azure App Service MySQL in-app (preview)
The content below is taken from the original (Announcing Azure App Service MySQL in-app (preview)), to continue reading please visit the site. Remember to respect the Author & Copyright.
Today, we’re announcing a cool new feature (in preview) for Web developers using Azure App Service to create Web applications that use MySQL. MySQL in-app enables developers to run the MySQL server side-by-side with their Web application within the same environment, which makes it easier to develop and test PHP applications that use MySQL.
We’re also making it very easy to get started with this feature via the Azure portal. During the creation of your Web App, you’ll be able to select a “MySQL in-app (preview)” provider for your database, which will help provision the database.
We think this feature will be welcomed by Web developers who are looking to accelerate their testing because:
- It supports many PHP applications that use MySQL, such as WordPress, Joomla and Drupal
- It’s cost-effective since there’s no additional cost to use this feature and you only pay for the App Service plan (since resources are shared)
- The MySQL and Web processes are co-located in the same environment (hence the term in-app) which means storage is shared
- It includes support for slow query logging and general logging, which you’ll need to turn on as needed (this feature impacts performance, so you shouldn’t use it all the time)
Since this feature is in preview, and shares its resources with the Web application in the same App Service plan, MySQL in-app is not recommended for production applications. Please also keep in mind the following tips and limitations when using this feature:
- Check your storage limits and upgrade the web app pricing plan as needed to accommodate data for both MySQL and your web app. For storage and memory limits for your pricing tier, review the quota limitations for all App Service plans pricing tiers.
- Note you only get one MySQL database per web application. In a scenario where you have a deployment slot web app and a production web app, you will get one MySQL database for the deployment slot and one MySQL database for the production web app, if you decide to turn on this feature for each app. The database contents will not be synchronized, which makes it easier for you to try different schema versions and content.
- The auto scale feature is not supported since MySQL currently runs on a single instance. Similarly, enabling local cache is not supported.
- The MySQL database cannot be accessed remotely using the MySQL CLI or other tools that access external endpoints. You can only access your database content using PHPMyAdmin (which is bootstrapped upon provisioning) or using the KUDU debug console.
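If you want to poke at the database from the Kudu debug console, which supports PowerShell, something along these lines will show the connection details handed to your app. Treat the variable name (MYSQLCONNSTR_localdb) and the key names as assumptions for illustration; check your app's portal blade for the actual values it exposes:

# Read the in-app MySQL connection string exposed to the web app (name assumed)
$conn = $env:MYSQLCONNSTR_localdb

# Break the "Key=Value;Key=Value" string into a hashtable for easy inspection
$settings = @{}
foreach ($pair in $conn -split ';') {
    if ($pair -match '=') {
        $key, $value = $pair -split '=', 2
        $settings[$key] = $value
    }
}
$settings   # host, database name, user and password for the local MySQL instance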
The team continues working with Web developers in improving their experience in Azure App Service, particularly when it comes to data solutions. Over the last few months, we’ve come a long way in our data solution portfolio for Web developers, including revamping our PHP client drivers for Azure SQL, a new version of the JDBC drivers, expanded support for Linux on our ODBC drivers, MongoDB protocol support in DocumentDB and, earlier this week, an early technical preview of the new PHP on Linux SQL Server drivers. We will continue working on more data solutions that make it easier for Web developers to bring great applications to market in Microsoft Azure, whatever the language, stack and platform.
If you’re using MySQL in-app for development and testing and you are interested in migrating this application to production, Azure offers many solutions, including:
- ClearDB database
- ClearDB Clusters
- Marketplace solutions for MySQL, MariaDB and other MySQL-compatible solutions from partners like Bitnami and MariaDB
- Community-contributed Azure Resource Manager (ARM) templates deploying on VMs
- MySQL on virtual machine on Linux or Windows OS
We hope you get started with MySQL in-app in Azure App Service today, and share with us your feedback. Don’t have a subscription? Sign up for a free trial! And if you’re interested in getting more details about this feature, make sure you check out the detailed blog post.
Alerting and monitoring for Azure Backup
The content below is taken from the original (Alerting and monitoring for Azure Backup), to continue reading please visit the site. Remember to respect the Author & Copyright.
We are excited to announce the preview release of alerting and monitoring for Azure backups, which is currently the top-voted idea on Azure backup UserVoice. In a continuation of the simplified experience using the new Recovery Services vault, customers can now monitor cloud backups for their on-premises servers and Azure IaaS virtual machines in a single dashboard. In addition, they can also configure email notifications for all backup alerts.
Enroll your subscription for the preview release:
# Step 1: Log in to your Azure account from Windows PowerShell. Learn more on how to install Azure PowerShell.
PS C:\> Login-AzureRmAccount
# Step 2: Select the subscription you want to register for the preview
PS C:\> Get-AzureRmSubscription -SubscriptionName "Subscription Name" | Select-AzureRmSubscription
# Step 3: Register this subscription for the alerting preview
PS C:\> Register-AzureRmProviderFeature -FeatureName MABAlertingFeature -ProviderNamespace Microsoft.RecoveryServices
Introducing Recovery Services Vault
Introducing Alerting & Monitoring
If you are an existing Azure Backup customer using the Recovery Services vault, update to the latest Azure Backup agent to use this feature. If you configured email notifications before enrolling, turn off email notifications, enroll the subscription, and then configure notifications again.
Related links and additional content:
- If you are new to Azure Backup, start by configuring backup in the Azure portal
- Want more details? Check out Azure Backup documentation
- Need help? Reach out to the Azure Backup forum for support.
Advancing enterprise database workloads on Google Cloud Platform
The content below is taken from the original (Advancing enterprise database workloads on Google Cloud Platform), to continue reading please visit the site. Remember to respect the Author & Copyright.
Posted by Dominic Preuss, Lead Product Manager for Storage and Databases
We are committed to making Google Cloud Platform the best public cloud for your database workloads. From our managed database services to self-managed versions of your favorite relational or NoSQL database, we want enterprises with databases of all sizes and types to experience the best price-performance with the least amount of friction.
Today, we’re excited to announce that all of our database storage products are generally available and covered by corresponding Service Level Agreements (SLAs). We’re also releasing new performance and security support for Google Compute Engine. Whether you’re running a WordPress application with a Cloud SQL backend or building a petabyte-scale monitoring system, Cloud Platform is secure, reliable and able to store databases of all types.
Cloud SQL, Cloud Bigtable and Cloud Datastore are now generally available
Cloud SQL Second Generation, our fully-managed database service offering easy-to-use MySQL instances, has completed a successful beta and is now generally available. Since beta, we’ve added a number of enterprise features, such as support for MySQL 5.7, point-in-time recovery (PITR), automatic storage resizing and one-click setup of failover replicas.
Performance is key to enterprise database workloads, and Cloud SQL is delivering industry-leading throughput.
Cloud Bigtable is our scalable, fully-managed NoSQL wide-column database service with Apache HBase client compatibility, and is now generally available. Since beta, many of our customers such as Spotify, Energyworx and FIS (formerly Sungard) have built scalable applications on top of Cloud Bigtable for workloads such as monitoring, financial and geospatial data analysis.
Cloud Datastore, our scalable, fully-managed NoSQL document database, serves 15 trillion requests a month, and its v1 API for applications outside of Google App Engine has reached general availability. The Cloud Datastore SLA of 99.95% monthly uptime demonstrates high confidence in the scalability and availability of this cross-region, replicated service for your toughest web and mobile workloads. Customers like Snapchat, Workiva and Khan Academy have built amazing web and mobile applications with Cloud Datastore.
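To illustrate what the v1 API makes possible outside of App Engine, here is a minimal sketch using the google-cloud-datastore Python client. The "Task" kind, its properties and the key name are hypothetical, and the snippet assumes the client library is installed and application-default credentials are configured for your project.
# Minimal sketch with a hypothetical "Task" kind, run from outside App Engine.
# Assumes `pip install google-cloud-datastore` and application-default credentials.
from google.cloud import datastore

client = datastore.Client()                    # picks up the default project

key = client.key("Task", "sample-task")        # named key so the entity is easy to re-read
entity = datastore.Entity(key=key)
entity.update({"description": "Back up the database", "done": False})
client.put(entity)                             # upsert the entity

fetched = client.get(key)                      # read it back
print(fetched["description"], fetched["done"])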
Improved performance, security and platform support for databases
For enterprises looking to manage their own databases on Google Compute Engine (GCE), we’re also offering the following improvements:
- Microsoft SQL Server images available on Google Compute Engine – Our top enterprise customers emphasize the importance of continuity for their mission-critical applications. The unique strengths of Google Compute Engine make it the best environment to run Microsoft SQL Server, with images that include built-in licenses (in beta) as well as the ability to bring your existing application licenses. Stay tuned for a post covering the details of running SQL Server and other key Windows workloads on Google Cloud Platform.
- Increased IOPS for Persistent Disk volumes – Database workloads are dependent on great block storage performance, so we’re increasing the maximum read and write IOPS for SSD-backed Persistent Disk volumes from 15,000 to 25,000 at no additional cost, servicing the needs of the most demanding databases. This continues Google’s history of delivering greater price-performance over time with no action on the part of our customers.
- Custom encryption for Google Cloud Storage – When you need to store your database backups, you now have the added option of using customer-supplied encryption keys (CSEK). This feature allows Cloud Storage to operate as a zero-knowledge system, without access to your keys, and is now generally available (a minimal upload sketch follows this list).
- Low-latency for Google Cloud Storage Nearline storage – If you want a cost-effective way to store your database backups, Google Cloud Storage Nearline offers object storage at costs less than tape. Prior to today, retrieving data from Nearline incurred 3 to 5 seconds of latency per object. We’ve been continuously improving Nearline performance, and now it enables access times and throughput similar to Standard class objects. These faster access times and throughput give customers the ability to leverage big data tools such as Google BigQuery to run federated queries across your stored data.
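As a rough sketch of how customer-supplied keys are used, the snippet below uploads a backup object through the Cloud Storage JSON API with the x-goog-encryption-* headers carrying the key material. The bucket name, object name, backup file and ACCESS_TOKEN are placeholders, and the requests library is assumed; the same key and headers must be presented again whenever the object is read, because Cloud Storage does not retain the key.
# Rough sketch of uploading a database backup with a customer-supplied encryption
# key (CSEK). Bucket, object, file name and ACCESS_TOKEN below are placeholders.
import base64
import hashlib
import os
import requests

key = os.urandom(32)   # 256-bit AES key that you generate and manage yourself
headers = {
    "Authorization": "Bearer ACCESS_TOKEN",    # placeholder OAuth 2.0 token
    "x-goog-encryption-algorithm": "AES256",
    "x-goog-encryption-key": base64.b64encode(key).decode(),
    "x-goog-encryption-key-sha256": base64.b64encode(hashlib.sha256(key).digest()).decode(),
}
with open("backup.sql.gz", "rb") as f:         # hypothetical backup file
    resp = requests.post(
        "https://storage.googleapis.com/upload/storage/v1/b/my-bucket/o",
        params={"uploadType": "media", "name": "backups/backup.sql.gz"},
        headers=headers,
        data=f,
    )
resp.raise_for_status()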
Today marks a major milestone in our tremendous momentum and commitment to making Google Cloud Platform the best public cloud for your enterprise database workloads. We look forward to the journey ahead and helping enterprises of all sizes be successful with Cloud Platform.
ZeroStack Expands Hyperconverged Platform Partnerships with Dell and HPE
The content below is taken from the original (ZeroStack Expands Hyperconverged Platform Partnerships with Dell and HPE), to continue reading please visit the site. Remember to respect the Author & Copyright.
ZeroStack, a one-stop shop for hybrid cloud infrastructure that delivers agility without complexity, today announced that it is making its Z-Fabric… Read more at VMblog.com.