How to check Bluetooth version in Windows 10

The content below is taken from the original ( How to check Bluetooth version in Windows 10), to continue reading please visit the site. Remember to respect the Author & Copyright.

Bluetooth is one of the most common methods used to transfer files between a mobile device and a computer, but sometimes the Bluetooth version is not supported, which creates issues when connecting and transferring files. While […]

This post How to check Bluetooth version in Windows 10 is from TheWindowsClub.com.

Brit drone biz Sensat notches up 29km remote-control flight

The content below is taken from the original ( Brit drone biz Sensat notches up 29km remote-control flight), to continue reading please visit the site. Remember to respect the Author & Copyright.

Beyond visual line-of-sight exercise paves way for Amazon-style deliveries

A Brit drone firm has made a 29km beyond visual line-of-sight (BVLOS) flight, a small but important step for fuller commercialisation of the tech.…

This box sucks pure water out of dry desert air

The content below is taken from the original ( This box sucks pure water out of dry desert air), to continue reading please visit the site. Remember to respect the Author & Copyright.

For many of us, clean, drinkable water comes right out of the tap. But for billions it’s not that simple, and all over the world researchers are looking into ways to fix that. Today brings work from Berkeley, where a team is working on a water-harvesting apparatus that requires no power and can produce water even in the dry air of the desert. Hey, if a cactus can do it, why can’t we?

While there are numerous methods for collecting water from the air, many require power or parts that need to be replaced; what professor Omar Yaghi has developed needs neither.

The secret isn’t some clever solar concentrator or low-friction fan — it’s all about the materials. Yaghi is a chemist, and has created what’s called a metal-organic framework, or MOF, that’s eager both to absorb and release water.

It’s essentially a powder made of tiny crystals in which water molecules get caught as the temperature decreases. Then, when the temperature increases, the water is released into the air again.

Yaghi demonstrated the process on a small scale last year, but now he and his team have published the results of a larger field test producing real-world amounts of water.

They put together a box about two feet per side with a layer of MOF on top that sits exposed to the air. Every night the temperature drops and the humidity rises, and water is trapped inside the MOF; in the morning, the sun’s heat drives the water from the powder, and it condenses on the box’s sides, kept cool by a sort of hat. The result of a night’s work: 3 ounces of water per pound of MOF used.

That’s not much more than a few sips, but improvements are already on the way. Currently the MOF uses zirconium, but an aluminum-based MOF, already being tested in the lab, will cost 99 percent less and produce twice as much water.
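
To put those yields in perspective, here is a rough back-of-envelope calculation (ours, not the researchers’, and assuming a drinking requirement of about 2 liters per person per day): 2 liters is roughly 68 fluid ounces, and at 3 ounces of water per pound of MOF per night that works out to 68 / 3 ≈ 23 pounds (about 10 kg) of the current zirconium MOF per person, or roughly half that once the aluminum version doubles the yield.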

With the new powder and a handful of boxes, a person’s drinking needs are met without using any power or consumable material. Add a mechanism that harvests and stores the water and you’ve got yourself an off-grid potable water solution.

“There is nothing like this,” Yaghi explained in a Berkeley news release. “It operates at ambient temperature with ambient sunlight, and with no additional energy input you can collect water in the desert. The aluminum MOF is making this practical for water production, because it is cheap.”

He says that there are already commercial products in development. More tests, with mechanical improvements and including the new MOF, are planned for the hottest months of the summer.

Olafur Eliasson and IKEA are designing a series of affordable, solar-powered tools

The content below is taken from the original ( Olafur Eliasson and IKEA are designing a series of affordable, solar-powered tools), to continue reading please visit the site. Remember to respect the Author & Copyright.

The Danish-Icelandic artist Olafur Eliasson and the Swedish furniture retailer Ikea are teaming up to produce a range of accessible and affordable tools that function without mains electricity and use renewable energy.

Eliasson’s Little Sun project, co-founded with Frederik Ottesen, created a solar-powered torch for the 1 billion people worldwide who live off the power grid. It is now sold cheaply in more than 600 African outlets. […] The products developed will be available in Ikea stores.

Olafur Eliasson is a busy man these days: after recently completing his first building, he’s now announced a major design collaboration between his Little Sun initiative and IKEA to create a line of sustainable and affordable off-the-grid tools.

Little Sun Original in Burundi. Photo: Aminata Nimaga.

“Little Sun makes solar energy tangible and your world a little bit more sustainable,” Olafur said. “We are excited to collaborate with IKEA, raising awareness for energy access and the need for global togetherness. Together, we want to connect the world by sharing the power of the sun with everyone.”

Southampton gathering, June 12th

The content below is taken from the original ( Southampton gathering, June 12th), to continue reading please visit the site. Remember to respect the Author & Copyright.

But don’t panic – it’s a gathering of geeks, not immortals with a bent for beheading! There can be only one place to be if you’re a RISC OS user who happens to be in or around Southampton on the evening of June 12th: Itchen College, which is when and where the next meeting of […]

Estonia is the first country to offer free public transit nationwide

The content below is taken from the original ( Estonia is the first country to offer free public transit nationwide), to continue reading please visit the site. Remember to respect the Author & Copyright.

Tallinn, known for its digital government and successful tech startups, is often referred to as Europe’s innovation capital. Now celebrating five years of free public transport for all citizens, the government is planning to make Estonia the first free public transport nation.

Pop-Up City’s Regina Schröter interviews the Head of the Tallinn European Union Office, Allan Alaküla, about Estonia’s plans to expand the successful fare-free public transport model from the capital to the entire country on July 1: “Before introducing free public transport, the city center was crammed with cars. This situation has improved — also because we raised parking fees. When non-Tallinners leave their cars in a park-and-ride and check in to public transport on the same day, they not only use public transport for free but also won’t be charged the parking fee. We noticed that people didn’t complain about high parking fees once we offered them a good alternative.”

A number of other European cities, most notably Paris, also offer, or are considering, free public transport to ease traffic, reduce pollution, and boost local businesses.

Previously: Can offering free rides invigorate public transit?

Azure Stack for Born in the Cloud Providers

The content below is taken from the original ( Azure Stack for Born in the Cloud Providers), to continue reading please visit the site. Remember to respect the Author & Copyright.

When Azure Stack was launched, it was with the promise of bringing innovation at the speed of the Public Cloud to on-premises environments, and boy is it delivering. In the last few weeks alone, announcements have included:

Red Hat and Microsoft co-developing an OpenShift managed service for Azure and Azure Stack.

Pivotal announced the general availability of Pivotal Cloud Foundry on Azure Stack

CommVault announced Azure Stack support

F5 announced Azure Stack support

Service Fabric Clusters came to Azure Stack

Azure Stack 1804 Update was released, bringing features like:

  • Visual Studio support for Azure Stack in disconnected scenarios using ADFS
  • Av2 and F Series Virtual Machines
  • Update to the SCOM Management Pack
  • New Azure Stack PowerShell version

Azure Stack App Service Update 2 was released, bringing features like:

  • Auto swap of deployment slots
  • Testing in Production
  • Azure Functions Proxies
  • Updates to .Net Core support, NodeJS, and NPM versions
  • Secret and Certificate rotation for admins

… and more besides.

Azure Stack is here to stay, and it’s growing and maturing much more rapidly than any on-premises solution ever has before. As ever though, the goal of Azure Stack is not to replace Azure, it’s to extend it to wherever Azure cannot go.

I talk to a lot of companies who specialise in delivering services from Azure, who are almost unanimously of the opinion that Azure Stack is there to enable hosting service providers to compete with Azure, or that they’ll never need to touch Azure Stack because they have Azure and it’s ‘better’.

To very quickly recap where I see Azure Stack actively being used today:

  • Disconnected, low bandwidth, or high latency environments where access to Public Cloud is difficult or impossible.
  • Highly regulated environments where data cannot leave an organisation’s premises, or cannot be managed by a company headquartered in the US.
  • Where Azure Services are wanted, but the services cannot rely on connectivity outside of a building.
    • e.g. a hospital may not trust connectivity to Azure with patients’ lives, but may make use of Azure Stack for some application purposes and Azure for others, with a common application and management model across platforms.
  • Where a system of record on premises cannot move to Azure, but application innovation around it is desired.
    • e.g. a large mainframe, or a heritage system that cannot move but still stores masses of valuable data.
    • Azure Stack can deliver Azure innovation to that data within existing four walls.
  • As a stepping stone to Azure. Sometimes moving to Azure doesn’t make sense until all applications or application components can be moved, so Azure Stack is used as an interim modernisation step within a datacentre. Once all modernisation activities are completed, the workload is moved to Azure.
    • Typically in this scenario, an Azure Stack is not purchased; instead, a Service Provider’s multi-tenanted Azure Stack is used in conjunction with co-location services.

All of these are scenarios that exist today and which are causing companies very real pain, but the reality is that the born in the cloud service providers aren’t necessarily in a position to help with them. Most born in the cloud providers have no datacentre facilities, and no large capex budget to buy into Azure Stack themselves, so most of them gloss over customers with these problems and focus on those who they can move straight to Azure today.

The Azure Stack Early Adopter Initiative was originally called ‘The Azure Stack Early Adopter Initiative for Service Providers’ for very good reason. Azure Stack needs high quality datacentre facilities to operate within, in areas where Azure does not, so it’s the traditional datacentre hosting industry that’s best placed to deliver on its promise. I firmly believe that those service providers who invested in Azure Stack in its infancy have now sewn up the regions in which they’ve deployed. There isn’t a need for multiple Azure Stack providers in a small region, so those who have held off ‘until it’s ready’ have simply allowed their competitors to gain a probably insurmountable lead in knowledge, market share, and mindshare.

In the UK, if you are an Azure services provider and any of the above scenarios resonate with you and your customers but you can’t deliver Azure Stack yourself, I wholeheartedly encourage you to get in touch to see how we can help. At Pulsant we’ve developed a fully rounded partner programme around Azure Stack, both for multi-tenanted purposes in our datacentres and for single-tenanted purposes either in our DCs or on customer premises.

451 Research published a report earlier this year indicating that 48% of large and multi-national enterprises intend to use Azure Stack services either to complement existing on-premises infrastructure or as a stepping stone to public cloud. Microsoft’s strategy revolves around the Intelligent Cloud and the Intelligent Edge, with Azure Stack being the major enabler for edge scenarios. Working with Pulsant, born in the cloud Azure providers no longer have to leave the money that those opportunities represent on the table.

If you want to chat more around the hybrid cloud opportunity, then drop me a line on Twitter @KennyLowe, or via email at firstname.lastname@pulsant.com.

The post Azure Stack for Born in the Cloud Providers appeared first on Cloud and Datacenter Thoughts and Tips.

It’s RoboCop-ter: Boffins build drone to pinpoint brutal thugs in crowds

The content below is taken from the original ( It’s RoboCop-ter: Boffins build drone to pinpoint brutal thugs in crowds), to continue reading please visit the site. Remember to respect the Author & Copyright.

‘Violent behavior’ identified and highlighted by surveillance system destined for a police force near you

Video A drone surveillance system capable of highlighting “violent individuals” in a crowd in real time has been built by eggheads.…

6 years on, IPv4 still dominates IPv6

The content below is taken from the original ( 6 years on, IPv4 still dominates IPv6), to continue reading please visit the site. Remember to respect the Author & Copyright.

IPv6, the modern version of the Internet Protocol framework that underlies just about everything on the network, is seeing steady uptake among service providers, but still hasn’t pushed its predecessor, IPv4, into obsolescence, according to a report released today by the Internet Society.

There are 24 countries in the world where IPv6 totals more than 15% of overall IP traffic, and 49 that have topped the 5% threshold. Yet the Internet Society – a non-profit that works to promulgate internet standards and lobby for open access to the internet – describes the technology as having moved from the “early adoption” development stage to the “early majority” phase.

Firefox has a new side-by-side tab feature for multitaskers

The content below is taken from the original ( Firefox has a new side-by-side tab feature for multitaskers), to continue reading please visit the site. Remember to respect the Author & Copyright.

Firefox is jazzing things up with a couple of new test features that should embolden multitaskers and those who like to tinker with aesthetics. Side View lets you view a pair of tabs side-by-side without needing to open a new browser window. Once you…

Preparing for the Hybrid Cloud: A Checklist

The content below is taken from the original ( Preparing for the Hybrid Cloud: A Checklist), to continue reading please visit the site. Remember to respect the Author & Copyright.

Preparing for the Hybrid Cloud? Here’s What You Need To Do. Hybrid Cloud is a computing method which uses a mixture of on-premises, Private Cloud as… Read more at VMblog.com.

How I choose which services to use in Azure

The content below is taken from the original ( How I choose which services to use in Azure), to continue reading please visit the site. Remember to respect the Author & Copyright.

This blog post was co-authored by Barry Luijbregts, Azure MVP.

Last year, I attended a Pluralsight webinar hosted by Azure MVP and Pluralsight author, Barry Luijbregts, called Keep your dev team productive with the right Azure service. It was a fantastic webinar and I really enjoyed learning Barry’s thought process on how he selects which Azure services and capabilities to use for his own projects, and when he consults for his clients. Recently, I asked Barry to share his process in this blog post and on an episode of Azure Friday with Scott Hanselman (included below).

Microsoft Azure is huge and changes fast! I’m impressed by the services and capabilities offered in Azure and by how quickly Microsoft releases new services and features. It can be overwhelming. There is so much out there, and the list continues to grow, that it is sometimes hard to know which services to use for a given scenario.

I create Azure solutions for my customers, and I have a method that I use to help me pick the right services. This method helps me narrow down the services to choose from and pick the right ones for my solution. It helps me decide how to implement high-level requirements such as “Running my application in Azure” or “Storing data for my application in Azure.” Of course, these are just examples. There are many other categories to address when I’m architecting an Azure solution.

A look at the process

Let me show you the process that I use for “Running my application in Azure.”

First, I try to answer several high-level questions, which in this case would be:

  1. How much control do I need?
  2. Where do I need my app to run?
  3. What usage model do I need?

Once I’ve answered these questions, I’ve narrowed down the services from which to choose. And then, I look deeper into the services to see which one best matches the requirements of my application, including functionality as well as availability, performance, and costs.

Let’s go through the first part of the process where I answer the high-level questions about the category.

Question 1: How much control do I need?

In considering how much control I need, I try to figure out the degree of control I need over the operating system, load balancers, infrastructure, application scaling, and so on. This decides the category of services that I will be selecting from.

On the control side of the spectrum is the infrastructure-as-a-service (IaaS) category, which includes services like Azure Virtual Machines and Azure Container Instances. These give a lot of control over the operating system and the infrastructure that runs your application. But with control comes great responsibility. For instance, if you control the operating system, you are responsible for updating it and making sure that it is secure.

Illustration showing the tradeoffs in going from IaaS, to PaaS, to LaaS, and SaaS.

Figure 1. How much control do I need?

Further up the stack are services that fall into the platform-as-a-service (PaaS) category, which contains services like Azure App Service Web Apps. In PaaS, you don’t have control over the operating system that your application runs on, nor are you responsible for it. You do have control over scaling your application and your application configuration (e.g., the version of .NET you want your application to run on).

The next abstraction level is what I will here call logic as a service (LaaS), also known as serverless. This category contains services like Azure Functions and Azure Logic Apps. Here, Azure takes care of the underlying infrastructure for you, including scaling your app. Logic as a service gives little control over the infrastructure that your application runs on, which means that you are only responsible for creating the application itself and configuring its application settings, like connection strings to databases.

The highest level of abstraction is software as a service (SaaS), which offers the least amount of control and the most amount of time that you can focus on working on business value. An example of Azure SaaS is Azure Cognitive Services, which are APIs that you just call from your application. Somebody else owns the application code and infrastructure; you just consume the service. And all you manage is basic application configuration, like managing the API keys that you use to call the service.

Once I know how much control I need, I can pick the category of services in Azure and narrow down my choice.

Question 2: Where do I need my app to run?

The second question stemming from “Running my application in Azure” is: Where do I need my application to run?

Illustration contrasting running apps in Azure and somewhere else

Figure 2. Where do I need my app to run?

You might think that the answer would be: I need to run my application in Azure. But the answer may not be that simple. For example, maybe I do want parts of my application to run in the Azure public cloud but I want to run other parts in Azure Government or the Azure China cloud or even on-premises using Azure Stack.

Or it could be that I want to be able to run my application in Azure and on-premises (if rules and regulations change), on my local development computer, or even in public clouds from other vendors.

This question boils down to how vendor-agnostic I’d like to be and where to store my data.

Once I’ve answered this question, I can narrow down the choice of Azure services even further.

Question 3: What usage model do I need?

How my app will be used guides me to the answer to the third and final question: what usage model do I need?

Decision tree branching between apps used all the time (pay per month) and occasionally (pay per execution)

Figure 3. What usage model do I need?

Some applications are in use all the time, like a website. If that is the case for my application, I need to look for services that run on what I call the classic model. This means that they are always running and that you pay for them all month.

Other applications are only in use occasionally, like a background job that runs once every hour, or an API that handles order cancellations (called a few times a day). If my application runs occasionally, I need to select a service from the logic-as-a-service (or serverless) category. These services only run when you need them, and you only pay for them when they run.
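
To make the occasional model concrete, here is a minimal sketch of what an API like the order-cancellation example above might look like as an HTTP-triggered Azure Function using the Node.js programming model (the parameter names are purely illustrative, and the function.json that declares the HTTP trigger and output bindings is omitted):

module.exports = function (context, req) {
    // Runs only when a request arrives; on the consumption plan you are
    // billed per execution rather than for an always-on instance.
    var orderId = req.query.orderId || (req.body && req.body.orderId);
    context.res = orderId
        ? { status: 200, body: 'Cancellation received for order ' + orderId }
        : { status: 400, body: 'Please pass an orderId' };
    context.done(); // signal that the function has finished
};

Nothing is provisioned or paid for while this code sits idle, which is exactly the property the usage-model question is probing for.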

After I’ve answered the high-level questions for this category, I can narrow down the services that I can choose from to just a couple or even one.

The next step: Match service functionality to my application requirements

Now, I need to figure out which service fulfills the most requirements for my application. I do so by looking at the functionality that I need, the availability that the service offers in its service level agreement, the performance that it provides, and what it costs.

Finally, this leads me to the service that I want to use to run my application. Often, I use multiple services to run my application, because it consists of many pieces, like a user interface and APIs. So, it might turn out that I run my web application in an Azure App Service Web App and my APIs in Azure Functions.

Other categories of services

We’ve only discussed one category of requirements: “Running my application in Azure.” There are many more, like “Securing my application in Azure,” “Doing data analytics in Azure,” and “Monitoring my application in Azure.” The method I have described will help you with all these categories.

In a recent Azure Friday episode, How I choose which services to use in Azure, I spoke with Scott Hanselman about my process for deciding which services to use in Azure. In it, we discuss how I choose services based on “Running my application in Azure” and on “Storing my data in Azure.”

I hope that my method helps you to sort through and choose the best services for your application from the vast array of what’s available in Azure. Please don’t hesitate to reach out to me through Twitter (@AzureBarry) if you have any questions or feedback.

uefitool (0.25.0)

The content below is taken from the original ( uefitool (0.25.0)), to continue reading please visit the site. Remember to respect the Author & Copyright.

UEFITool is a cross-platform C++/Qt program for parsing, extracting and modifying UEFI firmware images

Deploying a Turnkey Raspberry Pi System

The content below is taken from the original ( Deploying a Turnkey Raspberry Pi System), to continue reading please visit the site. Remember to respect the Author & Copyright.

If you only do projects for yourself, you are spoiled. After all, you know your environment better than anyone. You know what power you’ll have, the temperature range, and how your network is configured. This last part is especially problematic if you are trying to deploy something that connects to a wireless LAN. How can you configure, say, a Raspberry Pi so that it can connect to an unknown user’s WiFi network? Fixing that problem is the goal of [schollz’s] Raspberry Pi Turnkey project.

The idea is simple. A Raspberry Pi image boots up for the first time and itself offers a WiFi hotspot called ConnectToConnect. The WiFi password is also ConnectToConnect. Once connected, you get configuration options that allow you to tailor the system to your network. Sure, you could have people log in and reconfigure via a serial terminal, wired ethernet (which isn’t always set up right, either), or a USB keyboard. But that’s not a great out-of-the-box experience for most customers.

When the WiFi credentials are entered into the login form at address 192.168.4.1, the Pi will modify its configuration and then reboot itself. The startup script makes sure the connection was successful. If the credentials are not correct, then the Pi will reboot back into AP mode to allow you to re-enter them.
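
For a sense of how that boot-time check might work, here is a minimal Node.js sketch (illustrative only; [schollz]’s actual startup script may differ, and 'iwgetid' is the standard Linux tool for querying the associated SSID):

var { execSync } = require('child_process');

function connectedSsid() {
    try {
        // 'iwgetid -r' prints the SSID the WiFi interface is associated with
        return execSync('iwgetid -r').toString().trim();
    } catch (e) {
        return null; // non-zero exit status: not associated with any network
    }
}

if (connectedSsid() === null) {
    // Credentials missing or wrong: go back to the ConnectToConnect access
    // point so the user can re-enter them at 192.168.4.1
    console.log('No WiFi association; rebooting into AP mode');
} else {
    console.log('WiFi up; continuing normal startup');
}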

This isn’t a fast process, so you should probably try to get it right the first time. You can build your image from the instructions, or you can just download a ready-to-go image that you can then customize.

We couldn’t help but think about the same kind of system we’ve seen for the ESP8266. We also wondered if the simultaneous AP and client code we’ve seen for the Pi Zero could help [schollz] reduce the rebooting required, but we aren’t sure about that.

Images: Gareth Halfacree [CC BY-SA 2.0] and [CC BY-SA 2.0].

Lego is basically building AR ‘Sims’ for its playsets

The content below is taken from the original ( Lego is basically building AR ‘Sims’ for its playsets), to continue reading please visit the site. Remember to respect the Author & Copyright.

Lego is infusing its bricks with digital magic in a series of new augmented reality experiences using Apple’s updated ARKit 2. The Lego AR experiences, due out later this year, combine real-world Lego buildings with digital landscapes. Build a physic…

Raspberry Pi’s Power Over Ethernet Hardware Sparks False Spying Hubbub

The content below is taken from the original ( Raspberry Pi’s Power Over Ethernet Hardware Sparks False Spying Hubbub), to continue reading please visit the site. Remember to respect the Author & Copyright.

Have you ever torn open an Ethernet jack? We’d bet the vast majority of readers — even the ones elbow-deep into the hardware world — will answer no. So we applaud the effort in this one, but the conclusion landed way off the mark.

In the last few days, a Tweet showing a Raspberry Pi with its Ethernet socket broken open suggested the little PCB inside it is a hidden bug. With more going on inside than one might expect, the conclusion of the person doing the teardown was that the Raspberry Pi foundation are spying upon us through our Ethernet traffic. That’s just not the case. But we’re still excited about what was found.

The truth is rather more obvious to anyone who has spent a lot of time working with Ethernet interfaces. To confirm our suspicions we had a chat with Roger Thornton, the Principal Hardware Engineer at Raspberry Pi Trading, and the man with his finger on the pulse of what goes into a Pi. He was very happy to confirm, in a fascinating conversation about sourcing Ethernet jacks with integrated magnetics, that of course the Pi contains no surveillance hardware. Instead, what our Tweeter had found were the magnetics, the isolation transformer, and some filter components included because the latest Raspberry Pi version (Raspberry Pi 3 Model B+) has support for a future Power Over Ethernet (PoE) add-on.

It is these filter parts on their little PCB that seem to have captured attention as possible nefarious parts, but in debunking the whole idea it’s worth taking a look at the magnetics themselves because they are an interesting and above all inexpensive part that has some uses outside the world of Ethernet.

What Goes Into an Ethernet Jack Anyway?

The schematic of the Raspberry Pi 3 B+ Ethernet magnetics.

[Roger] was kind enough to give us the schematic of the jack used in the Pi, though it is typical of such jacks and bears very little difference from what you’d find in any number of other nearly identical components. At its centre is a row of four transformers, which serve to isolate the Pi from whatever stray voltages may be present upon the Ethernet cabling. To the left of each transformer is a trifilar choke to cancel out any common-mode noise that may have arrived via the cable. At the bottom is a set of RC filters on the PoE power lines, which were probably the components that ignited the controversy.

The transformers have a bandwidth from about 250 kHz to 100 MHz, which allows them to ignore the 48 V DC of the PoE supply and pass through the Ethernet signals. In the case of the Pi 3 B+, the PoE lines are taken to a header on the board, which will mate with an upcoming PoE HAT that will contain a switching PSU with its own 1.5 kV isolation transformer to preserve the barrier between the Pi and the line.

An interesting aside from the conversation is that some companies wind their own transformers while others buy ready-made ones. An easy way to spot the in-house ones, it seems, is that they will usually be concealed under some black sealant, as is the case with the one in the Tweet, while the bought-in ones will be standalone potted components.

Hacking Ethernet Components

A row of Ethernet transformers in what looks like a switch. By Hans Haase [CC BY-SA 4.0].

So with [Roger]’s help we’ve established that the Pi contains no nefarious components (which of course we knew anyway). But there remains the discovery that every Ethernet card ever made has at least one transformer with a very usable RF bandwidth, so these are components very much worth a second look if you have an interest in radio. If you have ever needed a small RF transformer for inter-stage coupling you will be aware that these are not cheap components, so the thought that there is a ready supply you can either lift from an old Ethernet-connected device or buy for only a bit more than pennies is of great interest.

Ethernet transformers are available in a range of packages, from those built in to sockets like the one on the Pi, through various surface-mount packages, and even through-hole versions on older devices. They come in three main categories which, as you might expect, correspond to the different Ethernet standards. 10 Mbit components usually have two transformers in one package and a bandwidth from about 100 kHz to 10 MHz, 100 Mbit ones have two with a 300 kHz to 100 MHz bandwidth, and 1000 Mbit ones like the Pi component described above have four 300 kHz to 100 MHz transformers. The transformers themselves are wound onto tiny ferrite rings, and are almost always 1:1 centre-tapped with a 600 Ω impedance and a common-mode choke on one side as described above.

An Ethernet transformer providing a balanced mixer input to a home-built HF receiver.

Within those limitations they can be used for a variety of small-signal RF tasks that suit those impedance ranges. By using only one half of the winding on the side that doesn’t have the common-mode choke, they can be configured as a 1:2 step-up transformer with a 150 Ω input and a 600 Ω output, for example. In bandwidth terms they also have useful performance some way beyond their advertised specification: here in Europe they will pass long wave broadcast stations at around 150 kHz to 200 kHz, and at the other end of their range they will pass FM broadcast stations at 108 MHz. Within that range you have the entirety of HF and the lower end of VHF, covering every amateur band from the MF bands to 6 m and 4 m if your country has an allocation. Where this is being written they are the go-to transformer in homebrew receivers and small-signal stages, infinitely preferable to winding a toroid.
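
For anyone wanting to check that arithmetic, the relevant relation (standard transformer theory, not anything specific to these parts) is that impedance scales with the square of the turns ratio:

Z_in / Z_out = (N_in / N_out)^2

Using half of the winding gives a 1:2 turns ratio and therefore a 1:4 impedance ratio, and one quarter of the 600 Ω winding impedance is the 150 Ω input figure quoted above.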

Sometimes it is easy to believe that an application-specific component such as an Ethernet transformer could have no other uses, but it’s worth reading data sheets and asking the question as to whether any useful components lurk unnoticed in your junk bin. Take a second look, you’d be surprised what you might find!

Counting Bees With A Raspberry Pi

The content below is taken from the original ( Counting Bees With A Raspberry Pi), to continue reading please visit the site. Remember to respect the Author & Copyright.

Even if keeping bees sounds about as wise to you as keeping velociraptors (we all know how that movie went), we have to acknowledge that they are a worthwhile thing to have around. We don’t personally want them around us of course, but we respect those who are willing to keep a hive on their property for the good of the environment. But as it turns out, there are more challenges to keeping bees than not getting stung: you’ve got to keep track of the things too.

Keeping an accurate record of how many bees are coming and going, and when, is a rather tricky problem. Apparently bees don’t like electromagnetic fields, and will flee if they detect them. So putting electronic measuring devices inside of the hive can be an issue. [Mat Kelcey] decided to try counting his bees with computer vision, and so far the results are very promising.

After some training, a Raspberry Pi with a camera can count how many bees are in a given image to within a few percent of the actual number. Getting an accurate count of his bees allows [Mat] to generate fascinating visualizations about his hive’s activity and health. With real-world threats such as colony collapse disorder, this type of hard data can be crucial.

This is a perfect example of a hack which might not pertain to many of us as-is, but still contains a wealth of information which could be applicable to other projects. [Mat] goes into a fantastic amount of detail about the different approaches he tried, what worked, what didn’t, and where he goes from here. So far the only problem he’s having is with the Raspberry Pi: it’s only able to run at one frame per second due to the computational requirements of identifying the bees. But he’s got some ideas to improve the situation.

As it so happens, we’ve covered a few other methods of counting bees in the past, though this is the first one to be entirely vision based. Interestingly, this method is similar to the project to track squirrels in the garden. Albeit without the automatic gun turret part.

Receiving and handling HTTP requests anywhere with the Azure Relay

The content below is taken from the original ( Receiving and handling HTTP requests anywhere with the Azure Relay), to continue reading please visit the site. Remember to respect the Author & Copyright.

If you followed Microsoft’s coverage from the Build 2018 conference, you may have been as excited as we were about the new Visual Studio Live Share feature that allows instant, remote, peer-to-peer collaboration between Visual Studio users, no matter where they are. One developer could be sitting in a coffee shop and another on a plane with in-flight WiFi, and yet both can collaborate directly on code.

The "networking magic" that enables the Visual Studio team to offer this feature is the Azure Relay, which is a part of the messaging services family along with Azure Service Bus, Azure Event Hubs, and Azure Event Grid. The Relay is, indeed, the oldest of all Azure services, with the earliest public incubation having started exactly 12 years ago today, and it was amongst the handful of original services that launched with the Azure platform in January 2010.

In the meantime, the Relay has learned to speak a fully documented open protocol that can work with any WebSocket client stack, and allows any such client to become a listener for inbound connections from other clients, without needing inbound firewall rules, public IP addresses, or DNS registrations. Since all inbound communication terminates inside the application layer instead of far down at the network link level, the Relay is also an all around more secure solution for reaching into individual apps than using virtual private network (VPN) technology.

In time for the Build 2018 conference, the Relay learned a brand-new trick that you may have missed learning about amidst the torrent of other Azure news, so we’re telling you about it again today: The Relay now also supports relayed HTTP requests.

This feature is very interesting for applications or application components that run inside of containers and where it’s difficult to provide a public endpoint, and is especially well-suited for implementing Webhooks that can be integrated with Azure Event Grid.

The Relay is commonly used in scenarios where applications or devices must be reached behind firewalls. Typical application scenarios include the integration of cloud-based SaaS applications with points of sale or service (shops, coffee bars, restaurants, tanning salons, gyms, repair shops) or with professional services offices (tax advisors, law offices, medical clinics). Furthermore, the Relay is increasingly popular with corporate IT departments who use relay based communication paths instead of complex VPN setups. We will have more news regarding such scenarios in the near future.

The new HTTP support lets you create and host a publicly reachable HTTP(S) listener anywhere, even on your phone or any other device, and leave it to the Azure Relay service to provide a resolvable DNS name, a TLS server certificate, and a publicly accessible IP address. All your application needs is outbound WebSocket connectivity to the Azure Relay over the common HTTPS port 443.

For illustration, let’s take a brief look at a simple Node.js example. First, here’s a minimal local HTTP listener built with Node.js “out of the box”:

var http = require('http');
var port = process.env.PORT || 1337;

http.createServer(function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello World\n');
}).listen(port);

This is the equivalent Node.js application using the Azure Relay:

var http = require('hyco-https');

var uri = http.createRelayListenUri("cvbuild.servicebus.windows.net", "app");
var server = http.createRelayedServer(
    {
        server: uri,
        token: () => http.createRelayToken(uri, "listen", "{…key…}")
    },
    function (req, res) {
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        res.end('Hello World\n');
    }).listen();

The key changes are that the Relay application uses the ‘hyco-https’ module instead of Node.js’ built-in ‘http’ module, and that the server is created using the ‘createRelayedServer’ method supplying the endpoint and security token information for connecting to the Relay. Most important: The Node.js HTTP handler code is completely identical.
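
Once the relayed listener is running, it behaves like any public website: an HTTPS request to https://cvbuild.servicebus.windows.net/app from a browser or any other HTTP client is routed through the Relay to the handler above and answered with ‘Hello World’. Note that this assumes the hybrid connection is configured to allow anonymous senders; otherwise the caller must also present a token with the ‘send’ claim.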

For .NET Standard, we have extended the existing Relay API in the latest preview of the Microsoft.Azure.Relay NuGet package to also support handling HTTP requests. You create a HybridConnectionListener as you do for WebSocket connections, and then just add a RequestHandler callback.

var listener = new HybridConnectionListener(uri, tokenProvider);
listener.RequestHandler = (context) =>
{
    context.Response.StatusCode = HttpStatusCode.OK;
    context.Response.StatusDescription = "OK";
    using (var sw = new StreamWriter(context.Response.OutputStream))
    {
        sw.WriteLine("hello!");
    }
    context.Response.Close();
};

If you want to have existing ASP.NET Core services listen for requests on the Relay, you use the new Microsoft.Azure.Relay.AspNetCore NuGet package that was just released and that allows hosting existing ASP.NET Core apps behind the Relay by adding the "UseAzureRelay()" extension to the web host builder and configuring the connection string of a Hybrid Connection shared access rule (see readme, more samples).

public static IWebHost BuildWebHost(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseStartup<Startup>()
        .UseAzureRelay(options =>
        {
            options.UrlPrefixes.Add(connectionString);
        })
        .Build();
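
The connection string referenced above follows the usual Service Bus shape (all values here are hypothetical): Endpoint=sb://contoso.servicebus.windows.net/;SharedAccessKeyName=listenRule;SharedAccessKey={…key…};EntityPath=app, i.e. the namespace endpoint, the name and key of a shared access rule with Listen rights, and the hybrid connection name.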

The HTTP feature is now in preview in production, meaning you can use it alongside the existing Relay features, complete with support and SLA. Because it’s in preview and not yet entirely in its final shape, we might still make substantial changes to the HTTP-related wire protocol.

Since the Relay isn’t a regular reverse proxy, there are some lower-level HTTP details that the Relay overrides and that we will eventually try to align. One known issue of that sort is that the Relay always transforms HTTP responses to use chunked transfer encoding; while that might be fairly annoying for protocol purists, it doesn’t have a material impact on most application-level use cases.

To get started, check out the tutorials for C# or Node.js and then let us know via the feedback option below the tutorials whether you like the new HTTP feature.

UK puts legal limits on drone flight heights and airport no-fly zones

The content below is taken from the original ( UK puts legal limits on drone flight heights and airport no-fly zones), to continue reading please visit the site. Remember to respect the Author & Copyright.

The UK has announced new stop-gap laws for drone operators, restricting how high they can fly their craft (400ft) and prohibiting the devices from being flown within 1km of an airport boundary. The measures will come into effect on July 30.

The government says the new rules are intended to enhance safety, including the safety of passengers of aircraft — given a year-on-year increase in reports of drone incidents involving aircraft. It says there were 93 such incidents reported in the country last year, up from 71 the year before.

And while the UK’s existing Drone Code (which was issued in 2016) already warns operators to restrict drone flights to 400ft and to stay “well away” from airports and aircraft, those measures are now being baked into law via an amendment to the 2016 Air Navigation Order (ahead of a full drone bill, which was promised for spring but still hasn’t materialized).

UK drone users who flout the new height and airport boundary restrictions face being charged with recklessly or negligently acting in a manner likely to endanger an aircraft or any person in an aircraft, an offence which carries a penalty of up to five years in prison, an unlimited fine, or both.

Additional measures are also being legislated for, as announced last summer — with a requirement for owners of drones weighing 250 grams or more to register with the Civil Aviation Authority and for drone pilots to take an online safety test.

Users who fail to register or sit the competency tests could face fines of up to £1,000, though those requirements will come into force later, on November 30, 2019.

Commenting in a statement, aviation minister Baroness Sugg said: “We are seeing fast growth in the numbers of drones being used, both commercially and for fun. Whilst we want this industry to innovate and grow, we need to protect planes, helicopters and their passengers from the increasing numbers of drones in our skies. These new laws will help ensure drones are used safely and responsibly.”

In a supporting statement, Chris Woodroofe, Gatwick Airport’s COO, added: “We welcome the clarity that today’s announcement provides as it leaves no doubt that anyone flying a drone must stay well away from aircraft, airports and airfields. Drones open up some exciting possibilities but must be used responsibly. These clear regulations, combined with new surveillance technology, will help the police apprehend and prosecute anyone endangering the traveling public.”

Drone maker DJI also welcomed what it couched as a measured approach to regulation. “The Department for Transport’s updates to the regulatory framework strike a sensible balance between protecting public safety and bringing the benefits of drone technology to British businesses and the public at large,” said Christian Struwe, head of public policy Europe at DJI.

“The vast majority of drone pilots fly safely and responsibly, and governments, aviation authorities and drone manufacturers agree we need to work together to ensure all drone pilots know basic safety rules. We are therefore particularly pleased about the Department for Transport’s commitment to accessible online testing as a way of helping drone users to comply with the law.”

Last fall the UK government also announced it plans to legislate to give police more powers to ground drones to prevent unsafe or criminal usage — measures it also said it would include in the forthcoming drone bill.

Use a To-Do App to Take Notes

The content below is taken from the original ( Use a To-Do App to Take Notes), to continue reading please visit the site. Remember to respect the Author & Copyright.

When out and about, I used to put all my ideas into a pocket notebook. Then I switched to emailing myself from my phone. Then I tried the Notes app. Now I put them in Wunderlist, a to-do app. It’s not my favorite to-do app—Microsoft even released another app to replace it—because I use my favorite to-do app for my…

Read more…

ASUS’ latest crypto-mining motherboard can handle 20 GPUs

The content below is taken from the original ( ASUS’ latest crypto-mining motherboard can handle 20 GPUs), to continue reading please visit the site. Remember to respect the Author & Copyright.

ASUS is moving further into the cryptocurrency hardware market with a motherboard that can support up to 20 graphics cards, which are typically used for mining. The H370 Mining Master uses PCIe-over-USB ports for what ASUS says is sturdier, simpler c…

Skydio’s self-flying drone can now track down cars

The content below is taken from the original ( Skydio’s self-flying drone can now track down cars), to continue reading please visit the site. Remember to respect the Author & Copyright.

Skydio‘s first major update to their crazy cool self-flying drone fixes its 13 eyes on a new object to follow at high speeds: cars.

The Bay Area startup has expanded the following capabilities of its R1 drone beyond just humans, with cars now firmly within its sights. You’ll still be limited by the device’s 25mph top speed, so this won’t be shooting any NASCAR races, but the self-flying drone will be able to track and follow vehicles as they move through challenging terrain that would previously have been impossible to film without a skilled drone pilot.

Just don’t send this thing following after a self-driving car — unless you want the two to probably run away together and come back with a vengeance at a later date.

In our review of the R1 drone, we were struck by the strength of its core tech and excited by the promise offered by future software updates. Well, less than two months later, new functionality is already coming to the device with this big new update.

“With Skydio R1, cinematography becomes a software defined experience,” Skydio CEO Adam Bry said in a statement. “That means we can regularly introduce fundamentally new capabilities over time for all existing and future users.”

In addition to the new car mode, Skydio has also updated its Lead mode which aims to plot a user’s path before they take it and shoot footage accordingly. The company says that the new update will bring “more intelligent behavior” when it comes to navigating obstacles. New “quarter lead” and “quarter follow” modes also shift the perspective from only allowing straight-on or profile shots.

The Skydio R1 Frontier Edition goes for a decently pricey $2,499, and the new update goes live today.

New capabilities to enable robust GDPR compliance

The content below is taken from the original ( New capabilities to enable robust GDPR compliance), to continue reading please visit the site. Remember to respect the Author & Copyright.

Today marks the beginning of enforcement of the EU General Data Protection Regulation (GDPR), and I’m pleased to announce that we have released an unmatched array of new features and resources to help support compliance with the GDPR and the policy needs of Azure customers.

New offerings include the general availability of the Azure GDPR Data Subject Request (DSR) portal, Azure Policy, Compliance Manager for GDPR, Data Log Export, and the Azure Security and Compliance Blueprint for GDPR.

In our webcast today, President Brad Smith outlined our commitment to making sure that our products and services comply with the GDPR, including having more than 1,600 engineers across the company working on GDPR projects. As Brad noted, we believe privacy is a fundamental human right, and that individuals must be in control of their data. So I am pleased that Azure is part of keeping that commitment by being the only hyperscale cloud provider to offer the level of streamlined mechanisms and tools for GDPR compliance enforcement we are announcing today.

Azure Data Subject Request (DSR) portal enables you to fulfill GDPR requests. The DSR capability is generally available today through the Azure portal user interface, as well as through pre-existing application programming interfaces (APIs) and user interfaces (UIs) across the breadth of our online services. These capabilities allow customers to respond to requests to access, rectify, delete, and export personal data in the cloud. In addition, Azure enables customers to access system-generated logs as a part of Azure services.

Azure Policy enables you to set policies to conform to the GDPR. Azure Policy is generally available today at no additional cost to Azure customers. You can use Azure Policy to define and enforce policies that help your cloud environment become compliant with internal policies as well as external regulations.

Azure Policy is deeply integrated into Azure Resource Manager and applies across all resources in Azure. Individual policies can be grouped into initiatives to quickly implement multiple rules. You can also use Azure Policy in a wide range of compliance scenarios, such as ensuring that your data is encrypted or remains in a specific region as part of GDPR compliance. Microsoft is the only hyperscale cloud provider to offer this level of policy integration built in to the platform for no additional charge.
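
To make that concrete, here is a minimal sketch of the rule body of a policy definition of the allowed-locations variety (the region list is illustrative, and a full definition wraps this rule in metadata and parameters):

{
    "if": {
        "field": "location",
        "notIn": [ "northeurope", "westeurope" ]
    },
    "then": {
        "effect": "deny"
    }
}

Assigned at a subscription or resource group scope, a rule like this denies the creation of any resource outside the listed regions, which is one way to keep data resident in the EU as part of a GDPR compliance posture.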

Extend Azure Policies for the GDPR into Azure Security Center. Azure Security Center provides unified security management and advanced threat protection to help meet GDPR security requirements. With Azure Policy integrated into Security Center, you can apply security policies across your workloads, enable encryption, limit your exposure to threats, and help you respond to attacks.

The Azure Security and Compliance GDPR Blueprint accelerates your GDPR deployment. This new Azure Security and Compliance Blueprint will help you build and launch cloud-powered applications that meet GDPR requirements. It includes common reference architectures, deployment guidance, GDPR article implementation mappings, customer responsibility matrices, and threat models that enable you to quickly and securely implement cloud solutions.

Compliance Manager for Azure helps you assess and manage GDPR compliance. Compliance Manager is a free, Microsoft cloud services solution designed to help organizations meet complex compliance obligations, including the GDPR, ISO 27001, ISO 27018, and NIST 800-53. Generally available today for Azure customers, the Compliance Manager GDPR dashboard enables you to assign, track, and record your GDPR compliance activities so you can collaborate across teams and manage your documents for creating audit reports more easily. Azure is the only hyperscale cloud provider with this functionality.

Azure GDPR support and guidance help you stay compliant. Our GDPR sites on the Service Trust Portal and the Trust Center provide you with current information about Microsoft services that support the requirements of the GDPR. These include detailed guidance on conducting Data Protection Impact Assessments in Azure, fulfilling DSRs in Azure, and managing Data Breach Notification in Azure for you to incorporate into your own GDPR accountability program.

Global Regions help you meet your data residency requirements. Azure has more global regions than any other cloud provider, offering the scale you need to bring applications closer to people around the world, preserve data residency, and give customers the confidence that their data is under their control.

Microsoft has a long-standing commitment to privacy and was the first cloud provider to achieve certification for the EU Model Clauses and ISO/IEC 27018, and was the first to contractually commit to the requirements of the GDPR. Azure offers 11 privacy-focused compliance offerings, more than any other cloud provider. We are proud to be the first to offer customers this level of GDPR functionality.

Through the GDPR, Azure has strengthened its commitment to be first among cloud providers in providing a trusted, private, secure, and compliant cloud. We are continuing to build and release new features, tools, and supporting materials for our customers to comply with the GDPR and other important standards and regulations. We are proud to release these new capabilities and invite you to learn more in the Azure portal today.

Handling GDPR Right to Erasure Requests for Office 365

The content below is taken from the original ( Handling GDPR Right to Erasure Requests for Office 365), to continue reading please visit the site. Remember to respect the Author & Copyright.

Happy GDPR Day

GDPR Becomes Reality

The European Union General Data Protection Regulation (GDPR) comes into force today and we move from preparation to reality. Maybe now the flood of email asking for consent to remain on mailing lists will abate and we won’t see quite so many people trying to make hay from GDPR FUD. It’s not quite as bad as when The Irish Times reported that “an army of advisors, some of them chancers, have fanned out in recent months to make GDPR the most profitable cash cow/scare story since the millennium bug,” but it has come close.

In any case, organizations must now cope with the requirements set down in GDPR, which means that practical interpretations of what needs to be done with IT systems are the order of the day. Lots of preparatory work has no doubt been done; now it’s game time.

Two practical issues that Office 365 tenants might be asked to deal with soon are Data Subject Requests and Data Erasure Requests, defined under Articles 15 and 17 respectively. Office 365 has an off-the-shelf (partial) answer for one; how to handle the other is not as obvious.

Data Subject Requests

The release of support for GDPR Data Subject Request (DSR) cases in the Security and Compliance Center is a welcome step to help Office 365 tenants cope with the new regulations. However, discovering what personal information exists in Exchange, SharePoint, OneDrive, and Teams in response to a request to know what a data controller (an Office 365 tenant) holds about a data subject (a person) is only the start of the journey.

A DSR is a modified form of a standard Office 365 eDiscovery case. The search results returned by the DSR criteria are deliberately broad to uncover everything a tenant holds about someone. For example, searching by someone’s name will find information, but that doesn’t mean that the search results are relevant to the data subject, especially if their name is common. The information found in a scan of Office 365 probably includes many messages and files that don’t match the request, which means that some careful checking is necessary before anything is handed over.

Right to Erasure

The natural progression from searching to respond to an article 15 right of access request is when a data subject exercises their article 17 right to erasure. In other words, someone asks an organization to remove any personal information held about them without undue delay.

GDPR sets out several grounds to justify removal, including that the personal data are no longer necessary for the purpose for which they were collected, that the data subject has withdrawn consent, or that the data subject objects to how the controller processes their data.

For example, an ex-employee might ask their employer to remove all personal information held about them. This includes information like their personal email address and phone number, their national identification number, passport number, and other items of personal data that are not ordinarily in the public domain.

However, the data controller can argue that some information must be retained to comply with a legal obligation (article 17-3b) or to assist with legal claims (article 17-3e). For instance, an employer might need to keep tax records for an ex-employee for several years to comply with national tax regulations.

Deciding what personal data should be removed in response to right to erasure requests is an imprecise science at present. We probably need some guidance from the courts to establish exact boundaries for the data that must be removed and that which can be kept.

Office 365 and Personal Data

Office 365 is only one of the repositories where personal data lives within an organization, but given the pervasive nature of email for communications, and of Word and Excel for documenting and organizing HR data, it's likely that a lot of personal data exists within mailboxes and sites. Any erasure request that arrives into an organization using Office 365 therefore means that searches are needed across the locations below (a sketch of such a search follows the list):

  • Exchange: user, shared, group, and public folder mailboxes (and Skype for Business IM transcripts stored in user mailboxes).
  • SharePoint and OneDrive sites.
  • Yammer.
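
As a minimal sketch (hypothetical names and query; a connected Security and Compliance Center PowerShell session is assumed), a single compliance search can cover the Exchange, SharePoint, OneDrive, and public folder locations in one pass, while Yammer must be handled separately:

    # Sketch: one search spanning the locations listed above.
    # -SharePointLocation All covers SharePoint and OneDrive sites;
    # Yammer content is not searched and must be exported separately.
    New-ComplianceSearch -Name "Erasure Jane Doe - All Workloads" -Case "DSR-JaneDoe" `
        -ExchangeLocation All -PublicFolderLocation All -SharePointLocation All `
        -ContentMatchQuery '"Jane Doe" OR "jane.doe@contoso.com"'
    Start-ComplianceSearch -Identity "Erasure Jane Doe - All Workloads"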

Any personal data of interest in Teams conversations should be picked up in the compliance records captured for Teams in user and group mailboxes.

Office 365 DSRs are Starting Points for Erasure

Office 365 DSRs give a good start to solving the erasure dilemma because the output from searches shows where personal data for the data subject might exist. Yammer is the outlier here because Yammer content is not scanned by Office 365 content searches, so searches and exports of Yammer data must be processed separately. On the upside, given how Yammer is generally used, it's unlikely that much personal data exists in Yammer groups.

When you export the results of content searches, Office 365 generates manifests to show where the exported data originates. As noted above, it’s a mistake to assume that everything uncovered by a DSR case is relevant to a data subject, and manual checking is absolutely needed before any deletions occur. The export manifests are invaluable here because they tell those responsible for processing the request for erasure where to look.
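
Exports are driven by a compliance search action. A minimal sketch, assuming the hypothetical search above has completed:

    # Sketch: export the results of a completed search. The export package
    # includes manifests showing where each exported item originates.
    New-ComplianceSearchAction -SearchName "Erasure Jane Doe - All Workloads" -Export

    # Export actions are named <search>_Export; check progress like this.
    Get-ComplianceSearchAction -Identity "Erasure Jane Doe - All Workloads_Export" | Format-List Status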

The Need for Checking

Unfortunately, checking search results is a manual process. Before you delete messages or documents, you need to be sure that those items are relevant to the data subject and do not need to be kept for justifiable business reasons. For example, a check of a document might look for instances of the data subject's name together with other indications that the document should be removed, such as the presence of the data subject's Social Security Number or passport number.
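
Parts of that first pass can be scripted for text-based formats. This sketch (hypothetical local path; the patterns need tuning, and binary formats such as .docx or .pst need other tooling) flags exported files that mention the data subject and also contain a U.S. Social Security Number pattern:

    # Sketch: flag files that mention the data subject AND an SSN-like value.
    $exportPath = "C:\DSR\JaneDoe\Export"
    Get-ChildItem -Path $exportPath -Recurse -Include *.txt, *.csv, *.html |
        Select-String -Pattern 'Jane Doe' -List |
        Where-Object { Select-String -Path $_.Path -Pattern '\b\d{3}-\d{2}-\d{4}\b' -Quiet } |
        Select-Object -ExpandProperty Path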

For this reason, the content searches used to find matches should use precise identifiers whenever possible. A DSR case can span several searches, so you can have one based on the data subject's name and email address, and another for matches against their passport number, employee number, home address, or a similarly unique identifier. You can export the combined results of all searches in a single operation.
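
For instance, a minimal sketch of two precise searches within the same hypothetical DSR case:

    # Sketch: separate searches for unique identifiers rather than a name.
    New-ComplianceSearch -Name "DSR Jane Doe - Email" -Case "DSR-JaneDoe" `
        -ExchangeLocation All -ContentMatchQuery 'jane.doe@contoso.com'
    New-ComplianceSearch -Name "DSR Jane Doe - Passport" -Case "DSR-JaneDoe" `
        -ExchangeLocation All -ContentMatchQuery '"P1234567"'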

Redaction

In many cases, the requirement for erasure can be satisfied through redaction, or editing to erase the data subject's details from documents, spreadsheets, and other files. You cannot edit the body of an email, so messages holding personal data probably need to be removed. One complication here is that some content might be protected by Azure Information Protection rights management; in this case, protected files must be decrypted by an IRM super-user before they can be redacted.
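
For plain-text formats, a crude redaction pass can be scripted. This sketch uses hypothetical paths and identifiers; Office documents must be edited in their own applications, and rights-protected files decrypted first, as noted above:

    # Sketch: blank out identifiers in plain-text files (hypothetical values).
    $files = Get-ChildItem -Path "C:\DSR\JaneDoe\ToRedact" -Recurse -Include *.txt, *.csv
    foreach ($file in $files) {
        (Get-Content -Path $file.FullName -Raw) `
            -replace 'Jane Doe', '[REDACTED]' `
            -replace '\b\d{3}-\d{2}-\d{4}\b', '[REDACTED-SSN]' |
            Set-Content -Path $file.FullName
    }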

Document processing is complicated by the fact that SharePoint stores multiple versions of a file, meaning that although you might redact the text relating to a data subject in the current version of a document, other versions still exist that might include the information. To get around the problem, you can save a copy of the document, remove the original document, and make the change to the copy before uploading it (as version 1) to SharePoint.
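
A minimal sketch of that copy, remove, and re-upload sequence, assuming the SharePoint PnP PowerShell module and hypothetical site and file paths:

    # 1. Download a copy of the current version.
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/hr" -UseWebLogin
    Get-PnPFile -Url "/sites/hr/Shared Documents/Review.docx" `
        -Path "C:\DSR\JaneDoe" -FileName "Review.docx" -AsFile

    # 2. Redact the local copy, then remove the original,
    #    which deletes its stored versions with it.
    Remove-PnPFile -ServerRelativeUrl "/sites/hr/Shared Documents/Review.docx" -Force

    # 3. Upload the redacted copy; it starts again as version 1.
    Add-PnPFile -Path "C:\DSR\JaneDoe\Review.docx" -Folder "Shared Documents"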

Inactive Mailboxes

Information in inactive mailboxes is indexed and discoverable, so content searches will pick up any references that exist in these mailboxes. To remove items, you’ll have to restore or recover the inactive mailboxes before you can access the content with clients like Outlook or OWA.
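
A minimal sketch of that recovery, with hypothetical addresses and a connected Exchange Online PowerShell session assumed:

    # Sketch: copy an inactive mailbox's content into an active review mailbox
    # so that items can be examined and removed with Outlook or OWA.
    $inactive = Get-Mailbox -InactiveMailboxOnly -Identity "jane.doe@contoso.com"
    New-MailboxRestoreRequest -SourceMailbox $inactive.DistinguishedName `
        -TargetMailbox "dsr-review@contoso.com" -AllowLegacyDNMismatch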

Preservation Locks

Some items cannot be deleted from Office 365 because they are subject to a preservation lock, a special form of retention policy designed to keep information for a predetermined period that cannot be interfered with. Office 365 will keep these items until the lock expires.
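
You can at least see which retention policies carry a preservation lock. A minimal sketch, assuming a connected Security and Compliance Center PowerShell session:

    # Sketch: locked policies show RestrictiveRetention = True and
    # cannot be relaxed or removed before they expire.
    Get-RetentionCompliancePolicy | Format-Table Name, Enabled, RestrictiveRetention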

No Automatic Erasure

The bottom line is that responding to a request for erasure of Office 365 data under GDPR article 17 is unlikely to be an automatic or inexpensive process. Some simple cases might be processed by doing a search and then using something like the Search-Mailbox cmdlet to permanently remove items from mailboxes. However, the increasingly integrated nature of Office 365 means that those responsible for handling these cases can expect to do a lot of manual work to be sure that the organization responds as GDPR expects.
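
A minimal sketch of that simple path, with a hypothetical mailbox and query. Search-Mailbox requires the Mailbox Import Export role, and -DeleteContent is irreversible, so a -LogOnly pass first is prudent:

    # Pass 1: log what would be removed, without deleting anything.
    Search-Mailbox -Identity "jane.doe@contoso.com" `
        -SearchQuery '"Jane Doe" AND "P1234567"' `
        -LogOnly -LogLevel Full `
        -TargetMailbox "dsr-review@contoso.com" -TargetFolder "SearchLogs"

    # Pass 2: after reviewing the log, permanently remove the items.
    Search-Mailbox -Identity "jane.doe@contoso.com" `
        -SearchQuery '"Jane Doe" AND "P1234567"' -DeleteContent -Force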

More Help in the Future?

We don’t know yet whether Microsoft will develop DSRs further to include processing to handle requests for erasure, or the article 18 right of restriction of processing, where a data subject contests the accuracy of their personal data held in a system like Office 365. In all cases, as noted above, depending on automatic processing without checking is not a good idea because the chance that you’ll erase something important is high. Maybe this is a case when artificial intelligence can help. Time will tell.

Follow Tony on Twitter @12Knocksinna.

Want to know more about how to manage Office 365? Find what you need to know in “Office 365 for IT Pros”, the most comprehensive eBook covering all aspects of Office 365. Available in PDF and EPUB formats (suitable for iBooks) or for Amazon Kindle.

The post Handling GDPR Right to Erasure Requests for Office 365 appeared first on Petri.

Revolut adds Ripple and Bitcoin Cash support

The content below is taken from the original ( Revolut adds Ripple and Bitcoin Cash support), to continue reading please visit the site. Remember to respect the Author & Copyright.

Fintech startup Revolut is adding Bitcoin Cash and Ripple to its cryptocurrency feature. While cryptocurrency isn't really Revolut's main focus, the feature is an easy way to get started with cryptocurrencies.

If you have a Revolut account, you can now buy and hold Bitcoin, Litecoin, Ethereum, Ripple and Bitcoin Cash. Behind the scenes, the startup has partnered with Bitstamp to process the transactions. Revolut charges a 1.5 percent fee for cryptocurrency transactions and currently handles around 100,000 of them per day.

Unlike a traditional cryptocurrency exchange, Revolut doesn't let you send or receive cryptocurrencies from your account; you don't get a bitcoin address, for instance. All you can do is buy tokens in the app. If you want to transfer those tokens somewhere else, you'll have to sell them for USD, GBP, etc. and then buy cryptocurrencies on a traditional exchange using your fiat money.

Recently, the startup also announced a new feature called Vaults. Revolut users can set up a vault to save money over time.

You can round up your spare change every time you make a transaction. For instance, if you pay $3.47 for that delicious ice cream, you'll save 53 cents in your vault. You can also multiply that amount so that you save several times your spare change with each transaction. Many fintech startups also provide this feature.
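
A minimal sketch of the round-up arithmetic described above; the amounts and multiplier are hypothetical, and Revolut's actual implementation is not public:

    # Spare change is the gap to the next whole unit, optionally multiplied.
    $price      = [decimal]3.47
    $multiplier = 2
    $spare = ([math]::Ceiling($price) - $price) * $multiplier
    "Set aside: {0:N2}" -f $spare   # (4.00 - 3.47) * 2 = 1.06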

You can also set up recurring payments to set aside a bit of money each day, week, or month. Interestingly, you get to choose the currency of your vault, which means you can decide to buy ethers with your spare change and weekly payments, for instance. It's a way to spread purchases over time and smooth out the volatility of cryptocurrencies.

Users don’t earn interests on vaults. It’s just a way to set some money aside that doesn’t appear in your main Revolut account. You can decide to close your vault whenever you want.