Samsung’s extra-stretchable display can survive dents

The content below is taken from the original (Samsung’s extra-stretchable display can survive dents), to continue reading please visit the site. Remember to respect the Author & Copyright.

Flexible displays are nothing new. However, most of them don’t live up to the dreams of flexible tech — they may only bend in a limited way. Samsung thinks it can do better. It just unveiled a 9.1-inch prototype OLED display that’s stretchable in seemingly every way imaginable: you can bend, roll and even dent it (up to half an inch deep) knowing that it’ll revert to its original form. The technology is still very young, but Samsung believes the stretchy screen will be useful for everything from wearables to in-car displays. Imagine a very thin smartwatch that can take some knocks without smashing into pieces.

Samsung 9.1-inch 'stretchable' OLED with pushing comparison

That’s not the only trick Samsung has up its sleeve, either. It’s showing off a 1.96-inch 4K LCD whose ridiculously high 2,250 pixels per inch density would be ideal for virtual reality. You might not notice the distracting "screen door" effect (where you can see the gaps between pixels) common to VR headsets. There’s again no roadmap, but it’s just as well when even PC-based VR still requires a fairly speedy computer. It could be a while before enough people have PCs that can handle this extra-high resolution at the frame rates you need for smooth VR.

To top it all off, Samsung is also exhibiting a 5.09-inch OLED with "glassless" stereoscopic 3D — yes, it’s still kicking around the concept despite the decline of 3D technology. The use of OLED should offer more natural-looking results than an LCD, Samsung says. There is the risk of a panel being used for gimmickry (Samsung talks about games and pop-up books), but it could also add depth to VR experiences.

Via: Korea Herald, TechSpot

Source: Samsung Display (translated)

Huawei’s first laptop is a MacBook clone in looks alone

The content below is taken from the original (Huawei’s first laptop is a MacBook clone in looks alone), to continue reading please visit the site. Remember to respect the Author & Copyright.

Hot on the heels of Microsoft announcing the Surface Laptop, Huawei is ready to unveil its first real Windows 10 notebook. It’s called the MateBook X, and it might remind you of Apple’s 12-inch MacBook. In fact, the resemblance is so striking that I could barely tell the two apart without glancing at their logos. Although it looks very similar to Apple’s diminutive laptop, the MateBook actually has a few important (and useful) differences.

First of all, it has a fingerprint sensor integrated into the power button that not only makes it easier to sign into your profile, but is also more convenient if more than one person is a frequent user of your laptop. All someone has to do is sign in with their finger and the laptop will load up their profile.

Apple’s butterfly keyboard was controversial for its basically travel-free setup, and even though I’m pleased with the typing experience on my MacBook, I’m glad Huawei managed to deliver keys with 1.2mm of travel here. It’s a more traditional and familiar system that makes typing more comfortable. I also like the generously sized chiclet keys, and the only buttons that appear undersized are the up and down arrows. The keyboard is also splash-resistant so you don’t have to worry about spilling a little water on it while you work.

Huawei also equipped the MateBook X with a seventh-generation Intel Core i7 chip, 8GB of RAM and a 512GB SSD, which should promise speedy performance. I’ve only used the laptop for light multi-tasking so far, but in general it keeps pace with my needs.

The MateBook X’s battery is supposed to last a long time, too, with the company touting up to 10 hours of 1080p video playback on a charge. Speaking of video, you might appreciate the multimedia experience, not only because of the MateBook’s vibrant 13-inch 2K display, but also thanks to the Dolby Atmos-enhanced surround sound system. Unfortunately, during my preview period the display on my unit seemed dim in bright sunlight, but the audio was indeed immersive and loud.

Like the MacBook, Huawei’s laptop adopts a minimal approach to ports, which is likely to put off some people. You’ll only get two USB C slots, one of which feeds power to the device while you’re left with the other for external connections. To alleviate the potential inconvenience here, Huawei is including a dock with each MateBook X that provides options for HDMI, USB A, USB C and VGA. Unfortunately, though, there’s no microSD card reader.

Although the MateBook X basically stole its design from the MacBook, I still appreciate its skinny profile and light weight. I also dig its shiny chrome edges and the pink and blue color options available. Plus, Huawei has consistently been on point with its product aesthetics since the original MateBook 2-in-1 from last year, which sports an equally premium build and classy look. The MateBook X may look familiar, but I won’t dock Huawei points for crafting something that ultimately feels elegant.

We don’t yet know how much the MateBook X will cost, but we do know it’s expected to go on sale in the US this summer. A larger, more powerful version called the MateBook D will also be available, along with the MateBook E — an updated version of the company’s 2-in-1. Huawei’s first stab at a laptop comes with just one or two unique features, but at first glance it appears the MateBook X has the basics covered. Whether it will do well as a workhorse, though, remains to be seen.

LinkedIn finds friends to join its ‘Open19’ data centre standards effort

The content below is taken from the original (LinkedIn finds friends to join its ‘Open19’ data centre standards effort), to continue reading please visit the site. Remember to respect the Author & Copyright.

LinkedIn wants you to brick it in the data centre by following it and its friends with a new standard for data centre hardware that pushes its ambitions to the edge and into competition with the Facebook-derived Open Compute Project.

Microsoft’s data-harvesting firm first floated the idea of defining data centre hardware last year, naming it Open19 and stating the ambition to “establish a new open standard for servers based on a common form factor.”

If that sounds an awful lot like the aims of Open Compute, you’re not alone in thinking so.

Which may be why on Wednesday LinkedIn changed its tune slightly, announcing that Open19 is now about “a new generation of open data centers and edge solutions” (Reg emphasis). There’s also a new Open19 Foundation that aims to “optimize for any size data centers to a degree that was previously only practical for very large data centers.”

Open19 will produce designs for "large data centers with over 100,000 servers, large data centers with open slots only, and small edge platforms."

To The Register‘s mind, the important difference between Open19 and Open Compute is the intention to define a single set of kit that will run anywhere. That’s important because it’s increasingly assumed that public clouds will often be too expensive or too slow to handle data gathered by sensors.

Putting computing muscle closer to where data is made – "the edge" – is therefore attracting plenty of interest. If you need edge computing, it may be that you would prefer that it uses the same kit and designs as the stuff you use in your main bit barns. Open19 looks to have set out its stall on hardware homogeneity no matter where you run. Indeed, it is promising common components for infrastructure.

Participants include GE Digital, HPE and edge data centre upstart Vapor IO, and they’ve developed the idea of a "brick cage", a chassis somewhat akin to a blade chassis, offered in 12U or 8U configurations and designed for quick and easy cabling for the "bricks" that will reside within. Those bricks can be servers or storage and come in different sizes. The idea appears to be that you’ll populate each brick cage with kit that meets your needs, then hook it up to a full-width switch or "power shelf".

Open19 says the bricks will come together as “standard building blocks” with the following specs:

  • Standard 19” 4-post rack
  • Brick cage
  • Brick (B), Double Wide Brick (DWB), Double High Brick (DHB), Double High & Wide Brick (DHWB)
  • Power shelf: 12V distribution, OTS power modules with any AC or DC inputs
  • Optional Battery Backup Unit (BBU)
  • Networking switch (ToR)
  • Snap-on power cables – up to 400W per brick, linear growth with size
  • Snap-on data cables – up to 100G per brick, linear growth with size
Bricking it: Open19’s chassis design

Importantly, it appears that Open19 isn’t mandating particular servers: if a machine fits into one of the four brick specs, it should get along fine. White box server-makers SuperMicro, Inspur, Wiwynn and QCT have signed up, as have the likes of Broadcom, Mellanox and Schneider Electric. So the whole data centre gang is here and ready to play.

All Open19 needs now is users. Which is of course the hard part. ®

Cloud Source Repositories: now GA and free for up to five users and 50GB of storage

The content below is taken from the original (Cloud Source Repositories: now GA and free for up to five users and 50GB of storage), to continue reading please visit the site. Remember to respect the Author & Copyright.

By Chris Sells, Product Manager

Developers creating applications for App Engine and Compute Engine have long had access to Cloud Source Repositories (CSR), our hosted Git version control system. We’ve taken your feedback to get it ready for the enterprise, and are excited to announce that it’s leaving beta and is now generally available.

The new CSR includes a number of changes. First off, we’ve increased the supported repository size from 1GB to 50GB, which should give your team plenty of room for large projects.

Second, CSR has a new pricing model, complete with a robust free tier that should allow many of you to use it at no cost. Customers can use CSR associated with their billing accounts for free each month, provided that the repos meet the following criteria:

  • Up to five project-users accessing repositories
  • Source repos consume less than 50GB in storage
  • Access to repos uses less than 50GB of network egress bandwidth

Beyond that, pricing for CSR is $1/project-user/month (where a project-user represents each user working on each project) plus $0.10/GB/month for storage and $0.10/GB for network egress. Network ingress is offered at no cost and you can still create an unlimited number of repositories.
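To make the arithmetic concrete, here is a rough back-of-the-envelope sketch in PowerShell. The team size and usage figures are invented, and it assumes the paid rates apply only to usage above the free allowances – a reading of the pricing model on my part, not something the announcement states explicitly.

# Hypothetical usage: 8 project-users, 120 GB stored, 80 GB of network egress in a month.
# Assumes charges apply only above the free tier (5 users, 50 GB storage, 50 GB egress).
$users = 8; $storageGB = 120; $egressGB = 80
$charge = [Math]::Max($users - 5, 0) * 1.00 +
          [Math]::Max($storageGB - 50, 0) * 0.10 +
          [Math]::Max($egressGB - 50, 0) * 0.10
"Estimated monthly charge: `$$charge"   # 3*1.00 + 70*0.10 + 30*0.10 = 13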

For further details, visit the Cloud Source Repositories pricing page.

Getting started with Cloud Source Repositories

To get started with CSR, go to http://bit.ly/2qcVtQO or choose Source Repositories from the Cloud Console menu:

Creating a CSR repo is as easy as pressing the "Get started" button in the Cloud Console and providing a name:

Or if you prefer, you can create a new repo from the gcloud command line tool, either from your local shell (make sure to execute “gcloud init” first) or from the Cloud Shell:
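In place of the screenshot from the original post, here is a minimal sketch of what that looks like with the gcloud tool; the repository name is a placeholder of my choosing.

# One-time setup: authenticate and pick a default project.
gcloud init
# Create a new Cloud Source Repository named "my-app" in the current project.
gcloud source repos create my-app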

Once you’ve created your repo, browse it from the Source Repositories section of the Cloud Console or clone it to your local machine (making sure you’ve executed “gcloud init” first) or into the Cloud Shell:
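Again standing in for the original screenshot, cloning the hosted repo locally looks roughly like the following (repository and directory names are hypothetical); gcloud also configures git credentials for you.

# Clone the hosted repo into a local working directory.
gcloud source repos clone my-app
cd my-app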

Or, if you’re using Cloud Tools for IntelliJ (and soon our other IDE extensions), you can access your CSR repos directly from inside your favorite IDE:

As you’d expect, you can use standard git tooling to commit changes and otherwise manage your new repos. Or, if you’ve already got your source code hosted on GitHub or Bitbucket, you can mirror your existing repo into your GCP project.

Once you’ve created your repos, manage them with the Repositories section in the Cloud Console:

If you prefer using command line tools, there’s a full set of CLI commands available:
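For instance, assuming the same gcloud command group as above, listing the repositories in the current project is a one-liner:

# Show every Cloud Source Repository in the active project.
gcloud source repos list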

You’ll also notice the reference to Permissions in the Cloud Console and IAM policies at the command line; that’s because IAM roles are fully-supported in CSR and can be applied at any level in the resource hierarchy.
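As a hedged illustration, granting a teammate read-only access with one of those IAM roles might look like the following; the project ID, user and choice of role are assumptions of mine, not details from the post.

# Bind the Source Repositories reader role to a user at the project level.
gcloud projects add-iam-policy-binding my-project --member="user:teammate@example.com" --role="roles/source.reader"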

And as if all of that weren’t enough, there’s a CSR management API as well, which is what we use ourselves to implement the gcloud CSR commands. If you’d like to get a feel for it, you can access the CSR API interactively in the Cloud API Explorer:
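If you would rather script against the API than click through the explorer, here is a rough sketch from PowerShell; the v1 endpoint path is my assumption about the API surface, not something quoted from the post.

# List repos via the REST API using credentials from the gcloud SDK (sketch only).
$token = gcloud auth print-access-token
Invoke-RestMethod -Uri "https://sourcerepo.googleapis.com/v1/projects/my-project/repos" -Headers @{ Authorization = "Bearer $token" }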

Full documentation for the CSR API is available for your programming pleasure.

Where are we?

Like our Cloud Shell and its new code editor, the new CSR represents a larger push toward a web-based experience for GCP developers. We’re thrilled with the feedback we’ve already gotten and look forward to hearing how you’re using CSR in your developer workflow.

If you’ve got questions about Cloud Source Repositories, feel free to drop them onto StackOverflow. If you’ve got feedback or suggestions, feel free to join in the discussion on Google Groups or Slack.

Public Cloud makes it to Africa for the first time

The content below is taken from the original (Public Cloud makes it to Africa for the first time), to continue reading please visit the site. Remember to respect the Author & Copyright.

Microsoft has announced that it will offer Microsoft Azure, Office 365 and Dynamics 365 from data centres in the South African cities of Cape Town and Johannesburg next year.

We won’t go too deep into Microsoft’s typically ShinyHappy™ announcement, because more interesting than its thrust that Redmond is about to, yet again, Make The World A Better Place is the fact that Azure looks like it will be the first of the major clouds to make it into Africa.

A glance at the world maps of AWS, IBM Bluemix and Google Cloud shows that none of them have a bit barn in Africa, nor immediate plans to get there. All three do have facilities in India, perhaps an eloquent statement about just where they think there’s money to be made at this time.

For what it’s worth, Cape Town looks a pretty good place for a bit barn: Telegeography’s submarine cable map tells us that three submarine cables connected to Europe land nearby and head up Africa’s west coast. To the east, the SAFE cable crosses the Indian Ocean and lands in Malaysia. SAFE also connects South Africa to numerous cables that head to Europe through the Red Sea.

The Register imagines the other big clouds will make it to Africa eventually. When they do, they may find Huawei waiting: the company is doing plenty of business in Africa and wants to build a global network of data centres running its FusionSphere cut of OpenStack.

Microsoft’s move means Antarctica is now the only continent without a public cloud. The Register reckons we could see one in space before we see one on the frozen continent. ®

New AWS Training and Certification Portal

The content below is taken from the original (New AWS Training and Certification Portal), to continue reading please visit the site. Remember to respect the Author & Copyright.

AWS Training and Certification can help you get more out of the AWS Cloud.

The new AWS Training and Certification Portal allows you to access and manage your training and certification activities, progress, and benefits – all in one place:

Previously, you had to rely on multiple websites to find and manage training and certification offerings. Now you have a central place where you can find and enroll in AWS Training, register for AWS Certification exams, track your learning progress, and access benefits based on the AWS Certifications you have achieved. This makes it easier for you to build your AWS Cloud skills and advance toward earning AWS Certification.

You can create a new account or simply log in with your existing Amazon account. If you already have an AWS Training account, you can migrate your existing AWS Training history into this new primary account. If you are an APN Partner, you can simply sign in using your APN Portal credentials. If you also had a Webassessor account, be sure to visit the Certification tab and merge this account too.

Once you are set up, you can rely on the AWS Training and Certification Portal to be your place to find the latest AWS training and certification offerings, built by AWS experts.

To learn more, read the AWS Training and Certification Portal FAQs.

Jeff;

 

Azure enables cutting edge Virtual Apps, Desktops and Workstations with NVIDIA GRID

The content below is taken from the original (Azure enables cutting edge Virtual Apps, Desktops and Workstations with NVIDIA GRID), to continue reading please visit the site. Remember to respect the Author & Copyright.

Professional graphics users in every industry count on an immersive, photorealistic, responsive environment to imagine, design, and build everything from airplanes to animated films. Traditionally, these high-powered workstations were tethered to physical facilities and shared among professional users such as designers, architects, engineers, and researchers. But today’s enterprises find themselves operating in multiple geographies, with distributed teams needing to collaborate in real-time.

Hence, last year we released Azure’s first GPU offerings targeting high-end graphics applications. NV-based instances are powered by the NVIDIA GRID virtualization platform and NVIDIA Tesla M60 GPUs, which provide 2,048 CUDA cores and 8GB of GDDR5 memory per GPU. These instances provide over a 2x performance increase in graphics-accelerated applications compared to the previous generation.

Targeting the high-end workstation user, you can run NVIDIA Quadro-optimized applications such as Dassault Systèmes CATIA or Siemens PLM, per user, directly on the NV instances without the need to deal with the complexity of licensing. Additionally, with up to 4 GPUs via NV24, you’re able to run up to 4 concurrent users utilizing these Quadro applications, with features such as multiple displays, larger maximum resolutions and certified Quadro software features from hundreds of software vendors.

Furthermore, if your organization has a need to run Virtual Apps or Virtual Desktops using solutions like RDS, Citrix XenApp Essentials, VMware Horizon, or Workspot, you’re now able to run up to 25 concurrent RDSH users per GPU. Office workers and professionals who don’t require Quadro optimized applications, can finally enjoy virtual desktops with a high-quality user experience that’s optimized for productivity applications. It’s all the performance of a physical PC, where and when you need it. You can now dramatically lower IT operational expense and focus on managing the users instead of PCs.

 

                                        NV6              NV12             NV24
Cores                                   6                12               24
GPU                                     1 x M60 GPU      2 x M60 GPUs     4 x M60 GPUs
Memory                                  56 GB            112 GB           224 GB
Disk                                    380 GB SSD       680 GB SSD       1.44 TB SSD
Network                                 Azure Network    Azure Network    Azure Network
Virtual Workstations                    1                2                4
RDSH Virtual Apps and Virtual Desktops  25               50               100

“Because so many of today’s modern applications and operating systems require GPU acceleration, organizations are seeking greater flexibility in their deployment and cost options,” says John Fanelli, VP NVIDIA GRID. “With NVIDIA GRID software and NVIDIA Tesla M60s running on Azure, Microsoft is delivering the benefits of cloud-based RDSH virtual apps and desktops to enable broad-scale, graphics-accelerated virtualization in the cloud that meets the needs of any enterprise.”

These new updates will go a long way toward making sure that you have the best infrastructure, whether you’re running the most graphics-demanding CAD applications that require Quadro optimization or just office productivity applications on the go.

Azure N-Series VMs are now generally available in multiple regions. To launch these VMs please visit the Azure Portal.
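As a quick sketch that is not part of the original post, creating an NV-series VM from the cross-platform Azure CLI might look something like this; the resource group, VM name, image alias and credentials are placeholders.

# Create a Windows Server 2016 VM on the GPU-backed NV6 size (1 x Tesla M60).
az vm create --resource-group gpu-rg --name nv6-demo --image Win2016Datacenter --size Standard_NV6 --admin-username azureuser --admin-password "Str0ngPassw0rd!"

Note that the GRID driver still needs to be installed inside the VM before the GPU is usable; Azure’s documentation covers that step for the NV sizes.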

Drones and AI help stop poaching in Africa

The content below is taken from the original (Drones and AI help stop poaching in Africa), to continue reading please visit the site. Remember to respect the Author & Copyright.

Several organizations are already using drones to fight poaching, but the Lindbergh Foundation is taking it one step further. The environmental non-profit has joined forces with Neurala in order to use the company’s deep learning neural network AI to boost the capabilities of the drones in its Air Shepherd program. Neurala taught its technology what elephants, rhinos and poachers look like, so it can accurately pinpoint and mark them in videos. It will now put the AI to work sifting through all the footage the foundation’s drones beam back in real time, including infrared footage taken at night.

The AI’s job is to pore over these videos and quickly identify the presence of poachers to prevent them from even reaching the animals’ herds. It’s the perfect addition to the Air Shepherd program that aims to use cutting edge software and drones to stop poaching in Africa.

Neurala CEO Max Versace said in a statement:

"This is a terrific example of how AI technology can be a vital force for good. We’re thrilled to be working with the Lindbergh Foundation in this unique partnership, contributing our deep learning software to such a worthwhile cause and doing our part to preserve endangered species."

It’ll be interesting to see how effective the program’s drone-AI system is in the future. For now, you can watch the video below to see how Neurala’s technology identifies objects of interest.

Source: Neurala

What is dead may never die: a new version of OS/2 just arrived

The content below is taken from the original (What is dead my never die: a new version of OS/2 just arrived), to continue reading please visit the site. Remember to respect the Author & Copyright.

Game of Clones: ArcaOS 5.0 promises to pick up where OS/2 Warp and eComStation left off

CD art for ArcaOS 5.0, code-named Blue Lion

An outfit called Arca Noae has released a new version of IBM’s venerable OS/2 operating system, named ArcaOS 5.0.

The Register understands that Arca Noae has a licence from IBM to do a distribution of OS/2, the OS that Big Blue pitched against Windows 95 back in the day. OS/2’s fourth release, named “Warp”, was widely regarded as technically superior to Windows 95 and Windows NT and could even run apps coded for Win32, but didn’t catch on because of a clunkier GUI and Microsoft’s hardball licensing tactics that made it commercially suicidal for PC-makers to offer the OS.

The OS nonetheless found a home in places like automatic teller machines and other niche applications of the sort IBM was often asked to build.

But sales were never stellar and once it was clear that there was no point in competing with Windows on the desktop IBM took OS/2 out behind the shed with a rifle, ending sales in 2001 and winding up support in 2006.

Users soon petitioned IBM to open source the OS on the grounds that it did lots of things well and its code could therefore benefit the coding community, but IBM rejected that petition on the grounds that there were still commercial and legal reasons to hang on to the source code. The legal reasons were easy to understand: OS/2 was initially co-developed with Microsoft, so entanglements were to be expected. Others speculated that OS/2 had found its way into military and intelligence agencies who would rather the world did not get a chance to rummage through its source code.

But OS/2 was still out there and users needed help, not least because hardware was crumbling beneath the OS. So IBM allowed an outfit called eComStation to do a version called “eCS/2” that allowed the OS to run on hardware using things like USB that had come along since OS/2’s last version, 4.52, emerged in 2001.

ArcaOS 5.0 now offers itself up as a migration option for users of both eCS/2 and OS/2. The new OS, numbered 5.0 to suggest itself as an heir to OS/2’s last release, promises it can run native OS/2, Windows 3.1 and DOS applications.

The OS can run on an Intel Pentium Pro or higher, or an AMD K6 or higher, but will run in 32-bit mode. A Compatibility Support Module allows the OS to run on UEFI-based machines. Audio is no problem, so long as hardware supports Uniaud 2.0, and USB 1.1 through 3.0 should work. Wired or wireless LANs with chipsets that support “MultiMac, GenMAC, or other drivers” will get the OS online. Virtualization? Yep! VirtualBox, VMware ESXi and VirtualPC should all handle ArcaOS 5.0.

Arca Noae’s selling two versions of ArcaOS 5.0. A US$99 personal edition comes with limited support. Splashing $229 gets you a commercial edition with a couple of years’ support. Subscriptions promise to deliver a stream of driver and software updates.

If you’re the kind of retro-computing enthusiast who can’t resist this, do let us know how it runs! ®

New smaller Windows Server IaaS Image

The content below is taken from the original (New smaller Windows Server IaaS Image), to continue reading please visit the site. Remember to respect the Author & Copyright.

We continue to find ways to make Azure a better value for our customers. Azure Managed Disks, a new disk service launched in Feb ’17, simplifies the management and scaling of Virtual Machines (VM). You can choose to create an empty Managed Disk, or create a Managed Disk from a VHD in a storage account, or from an Image as part of VM creation.

The pricing of Managed Disks, both Premium and Standard, is based on the provisioned disk size, which is different from the pricing of Standard Unmanaged Disks. To keep costs lower, we created lower disk pricing options with smaller 32GB and 64GB Standard Managed Disk sizes. Building on that foundation, we have also added a second set of Windows Server offerings with 30GB OS disks for Windows Server 2008 R2, Windows Server 2012, Windows Server 2012 R2 and Windows Server 2016 in the Azure Marketplace. These smaller images have "[smalldisk]" prepended to the image title in the Azure Portal; for PowerShell, CLI and ARM templates, the image SKU is appended with "-smalldisk". If your application does not require a large amount of OS disk space, you would see savings of $2.18 per VM by deploying with a 32GB Standard Managed OS disk instead of a 127GB one. For large-scale deployments, the benefit accumulates and may represent significant cost savings.

You also have the flexibility to expand the OS disk later by following the existing guide for expanding the OS disk.
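That guide is not reproduced here, but as a hedged sketch of the documented flow using the AzureRM PowerShell module (the resource group, VM name and target size below are placeholders), it looks roughly like this:

# Deallocate the VM, grow the managed OS disk, then start the VM again.
Stop-AzureRmVM -ResourceGroupName "my-rg" -Name "my-vm" -Force
$vm = Get-AzureRmVM -ResourceGroupName "my-rg" -Name "my-vm"
$disk = Get-AzureRmDisk -ResourceGroupName "my-rg" -DiskName $vm.StorageProfile.OsDisk.Name
$disk.DiskSizeGB = 127   # new size in GB; must be larger than the current size
Update-AzureRmDisk -ResourceGroupName "my-rg" -Disk $disk -DiskName $disk.Name
Start-AzureRmVM -ResourceGroupName "my-rg" -Name "my-vm"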

If you have expanded the OS disk, log into your Windows VM and use the Disk Management tool to extend the OS partition to match the new OS disk size.

Panasonic’s Toughbook 33 is designed for extreme field work

The content below is taken from the original (Panasonic’s Toughbook 33 is designed for extreme field work), to continue reading please visit the site. Remember to respect the Author & Copyright.

Let’s be real: The computers we use in our daily lives are too flimsy for seriously harsh environments like war zones or construction sites. For those who need machines that can survive those situations, Panasonic has the Toughbook range of rugged devices. The latest in the series — the Toughbook 33 — is a tablet that attaches to a keyboard, and it’s the most full-featured of its kind. Not only is it the "first fully rugged product" to use a 3:2 screen ratio for business applications (more on why that is important later), it also offers a comprehensive array of tools that will support workers in even the most extreme environments.

Panasonic touts the new 3:2 screen ratio as a very important benefit. With the height gained, people don’t need to scroll as much to get to crucial information that may be nearer the bottom of a page, such as an address, phone number or mission objectives. This saves precious time for those who are responding to urgent or emergency situations. During a recent demo, I found the 12-inch screen easy to read, although I wish it delivered punchier colors. Still, with a 2K resolution, it’s a perfectly functional display that Panasonic designed to be touch-responsive even when it’s wet or you’re wearing gloves.

I was concerned that the Toughbook 33’s dated, bulky design meant it was slower than the latest devices on the market, but its performance was on par when I used it to browse a few web pages or type some short sentences. That’s likely because it uses the newest (7th-gen) Intel Core i5 (or i7) processor with 8GB (or 16GB) of RAM and a 256GB (or 512GB) SSD for smooth multitasking, which is important when you’re dealing with critical situations or deadlines.

For work that is less time-sensitive (but just as important), the Toughbook 33 also offers plenty of helpful features. In addition to USB 3.0, microSD, HDMI, Ethernet, audio and nano-SIM ports, the tablet itself also has an optional serial socket. This lets operators connect to older devices like stop lights to run diagnostics or configuration programs. The Toughbook also has two cameras — an 8-megapixel one on the rear for onsite photography, and an infrared 1080p setup around the front that’s compatible with Windows Hello for secure and convenient logins — even in the dark.

You can choose to add one other slot to a bay at the top of the tablet, such as a fingerprint scanner, a barcode reader, another camera or an insertable smart card reader for secure credit-card scanning. There are also five physical buttons below the screen for volume control and accessing the home screen, as well as two that can be customized to launch apps or macro actions (such as repetitive automated tasks like adding a value to a new spreadsheet row). Panasonic also throws in a Wacom-made digitizer that stows away on the tablet, so you can mark up documents or scribble sketches on the go. And in case this exhaustive list wasn’t enough, there’s an SD card reader, HDMI port, VGA connection, Ethernet, serial port plus three more USB sockets on the companion keyboard.

At 3 pounds each, the keyboard and tablet weigh a whopping 6 pounds together. To make lugging that hefty combination easier, Panasonic built in a sturdy handle that you can pull out from the back of the hinge connecting the Toughbook 33 to its keyboard. I was able to comfortably tote the device by this handle for about a minute before my weak spindly arms started wobbling — it’s certainly not for the average consumer.

To keep all these tools working in harsh environments, the Toughbook is designed to meet MIL-STD-810G military standards for durability and is rated IP65 for water- and dust-resistance. The magnesium alloy body and rubbery elastomer edges certainly felt impenetrable when I played with it. Panasonic says it chose the magnesium alloy as it helps dissipate heat. Still, the Toughbook 33 uses a heat-piping system throughout the machine to keep it cool.

Under the protective case are slots for the two batteries that power the Toughbook 33, which can last for up to 10 hours altogether. The device will get energy from them simultaneously, switching seamlessly to the other pack when the first is depleted. Indicator lights will show when each is empty, so you can hot-swap it out without stopping what you’re working on.

At $4,099 for the tablet and keyboard, the Toughbook 33 is a big investment that Panasonic says its customers can expect to last between five and seven years. Its hefty price tag and footprint aren’t for everyone, but for those who need a two-in-one that can do it all in environments as extreme as war zones, the Toughbook 33 looks to be a capable and uncompromising device.

Why and how to disable SMB1 on Windows 10/8/7

The content below is taken from the original (Why and how to disable SMB1 on Windows 10/8/7), to continue reading please visit the site. Remember to respect the Author & Copyright.

Though security concerns with computer systems are nothing new, the mess caused by the WannaCrypt ransomware has prompted immediate action among netizens. The ransomware targets vulnerabilities in the SMB service of the Windows operating system to propagate.

SMB, or Server Message Block, is a network file sharing protocol meant for sharing files, printers, etc., between computers. There are three versions – Server Message Block (SMB) version 1 (SMBv1), SMB version 2 (SMBv2) and SMB version 3 (SMBv3). Microsoft recommends that you disable SMB1 for security reasons – and it has never been more important to do so than now, in view of the WannaCrypt (WannaCry) ransomware epidemic.

Disable SMB1 on Windows

To defend yourself against WannaCrypt ransomware it is imperative that you disable SMB1 as well as install the patches released by Microsoft. Let us take a look at some of the ways to disable SMB1.

Turn Off SMB1 via Control Panel

Open Control Panel > Programs & Features > Turn Windows features on or off.

In the list of options, one of them will be SMB 1.0/CIFS File Sharing Support. Uncheck the checkbox associated with it and press OK.

Restart your computer.
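If you prefer to script this step, the same optional feature can be removed from an elevated PowerShell prompt on Windows 8.1 and Windows 10; this is simply a command-line equivalent of the checkbox above, using the feature name those versions expose.

# Remove the SMB 1.0/CIFS optional feature (equivalent to unticking the checkbox), then reboot.
Disable-WindowsOptionalFeature -Online -FeatureName SMB1Protocol -NoRestart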

Disable SMB1 using PowerShell

Open a PowerShell window in the administrator mode, type the following command and hit Enter to disable SMB1:

Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" SMB1 -Type DWORD -Value 0 -Force

If, for some reason, you need to temporarily disable SMB versions 2 and 3, use this command:



Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" SMB2 -Type DWORD -Value 0 -Force

It is recommended to disable SMB version 1 since it is outdated and uses technology that is almost 30 years old.

According to Microsoft, when you use SMB1, you lose key protections offered by later SMB protocol versions, such as:

  1. Pre-authentication Integrity (SMB 3.1.1+) – Protects against security downgrade attacks.
  2. Insecure guest auth blocking (SMB 3.0+ on Windows 10+) – Protects against MiTM attacks.
  3. Secure Dialect Negotiation (SMB 3.0, 3.02) – Protects against security downgrade attacks.
  4. Better message signing (SMB 2.02+) – HMAC SHA-256 replaces MD5 as the hashing algorithm in SMB 2.02 and SMB 2.1, and AES-CMAC replaces that in SMB 3.0+. Signing performance increases in SMB2 and SMB3.
  5. Encryption (SMB 3.0+) – Prevents inspection of data on the wire, MiTM attacks. In SMB 3.1.1 encryption performance is even better than signing.

In case you wish to enable them later (not recommended for SMB1), the commands would be as follows:

For enabling SMB1:

Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" SMB1 -Type DWORD -Value 1 -Force

For enabling SMB2 & SMB3:

Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" SMB2 -Type DWORD -Value 1 -Force
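On Windows 8/Server 2012 and newer there is also a dedicated SMB cmdlet that avoids touching the registry directly; the following brief sketch is an addition here, not part of the original article.

# Check the current state, then switch off the SMBv1 server protocol.
Get-SmbServerConfiguration | Select-Object EnableSMB1Protocol, EnableSMB2Protocol
Set-SmbServerConfiguration -EnableSMB1Protocol $false -Force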

Disable SMB1 using Windows registry

You can also tweak the Windows Registry to disable SMB1.

Run regedit and navigate to the following registry key:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters

In the right-hand pane, the DWORD SMB1 should either be absent or have a value of 0.

The values for enabling and disabling it are as follows: a value of 0 disables SMBv1, while a value of 1 (or the absence of the SMB1 value altogether) leaves it enabled.
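If you would rather script the registry change than use regedit, the classic reg.exe one-liner below does the same thing from an elevated prompt; it is an equivalent added here, not a command from the article.

# Create or overwrite the SMB1 value with 0 to disable SMBv1 on the server side.
reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v SMB1 /t REG_DWORD /d 0 /f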

For more options and ways to manage the SMB protocols on the SMB server and the SMB client, visit Microsoft.



Social Engineering is on The Rise: Protect Yourself Now

The content below is taken from the original (Social Engineering is on The Rise: Protect Yourself Now), to continue reading please visit the site. Remember to respect the Author & Copyright.

As Internet security has evolved it has gotten easier to lock your systems down. Many products come out of the box pre-configured to include decent security practices, and most of the popular online services have wised up about encryption and password storage. That’s not to say that things are perfect, but as the computer systems get tougher to crack, the bad guys will focus more on the unpatchable system in the mix — the human element.

History Repeats Itself

Ever since the days of the ancient Greeks, and probably before that, social engineering has been one option to get around your enemy’s defences. We all know the old tale of Ulysses using a giant wooden horse to trick the Trojans into allowing a small army into the city of Troy. The Greeks left the horse outside the city walls after a failed ten-year siege, and the Trojans brought it in. Once inside the city walls, a small army climbed out in the dead of night and captured the city.

How different is it to leave a USB flash drive loaded with malware around a large company’s car park, waiting for human curiosity to take over and an employee to plug the device into a computer hooked up to the corporate network? Both the wooden horse and the USB drive trick have one thing in common, humans are not perfect and make decisions which can be irrational.

Famous Social Engineers

[Victor Lustig] was one of history’s famous social engineers specializing in scams, and was a self-confessed con man. He is most famous for having sold the Eiffel Tower. After the First World War, money was tight, and France was struggling to pay for the upkeep of the Eiffel Tower, which was falling into disrepair. After reading about the tower’s troubles, [Lustig] came up with his scheme: he would trick people into believing that the tower was to be sold off as scrap and that he was the facilitator for any deal. Using forged government stationery, he managed to pull this trick off: twice!

He later went on to scam [Al Capone] out of $5,000 by convincing him to invest $50,000 into a stock market deal. He claimed the deal fell through, although in reality there was no deal. After a few months, he gave Capone his money back, and was rewarded with $5,000 for his “integrity”.

[Charles Ponzi] was so notorious that the scheme he used, which is alive and well today, was named after him. A Ponzi scheme is a pyramid investment scam that uses new members’ money to pay older investors. As long as new recruits keep coming in, the people at the top of the pyramid get paid. When the pool of new suckers dries up, it’s over.

The biggest Ponzi scheme ever was run by then-respected high flyer and stock market speculator [Bernard Madoff]. The scheme, valued at around $65 billion, was and still is the biggest in history. Madoff was so prolific he had banks, governments and pension funds invested in his scheme.

[Kevin Mitnick] is probably the most famous computer hacker still alive today, however he was more of a social engineer than you would think. Kevin started young; at thirteen, he convinced a bus driver to tell him where to buy a ticket puncher for a school project, when in fact it would be used with dumpster dived tickets found in the bins of the bus company’s depot.

At sixteen, he hacked Digital Equipment Corporation’s computer systems, copying proprietary software and then going on to hack Pacific Bell’s voice mail computers along with dozens of other systems. He was on the run for a few years and was eventually imprisoned for his crimes. Out of jail, he has turned into a security consultant and does well for himself by staying on the correct side of the law.

[John Draper], AKA Captain Crunch, was a pioneer in the phone phreaking world. He gained his moniker because of free whistles given away in packages of Cap’n Crunch cereal. He realized that these whistles played 2,600 Hz which just happened to be the exact tone that AT&T long distance lines used to indicate that a trunk line was ready and available to route a new call. This inspired [John Draper] to experiment with and successfully build blue boxes. Those days are gone now, as the phone system switched from analog to digital.

Types Of Social Engineering Scams and How To Avoid Them

There are many different type of social engineering attacks — imagine counting up the number of ways that exist to trick people. Still, it’s worth understanding the most popular scams, because you do need to protect yourself.

Pretexting

This type of scam involves telling someone a lie in order to gain access to privileged areas or information. Pretexting is often done in the form of phone scams where a caller will claim to work for some big company and needs to confirm their target’s identity. They then go on to gather information like social security numbers, mother’s maiden name, account details and dates of birth. Because the call or the situation is normally initiated by the social engineer, a good way to protect yourself from this scam is to call back or confirm who they say they are – using information that you gathered about the company yourself, not given by them.

Baiting

Dropping malware-filled USB drives around parking lots, or giant wooden horses near your enemy’s walls, is classic baiting. This is a simple attack with a simple mitigation: remember that if something free and interesting just lying around looks too good to be true, then it probably is.

Phishing

Phishing is the practice of sending out e-mails, posing as a well-known web service or company, and aiming to get the recipient to open a compromised document, visit a poisoned website, or otherwise compromise their own security. A few weeks ago, Hackaday’s own [Pedro Umbelino] wrote about how easy it is to exploit even the most security conscious around us (it had me) with an IDN homograph attack.

Most phishing is done at a less sophisticated level – normally a clone of a website is made and emails are sent out telling victims to change their password. High-value targets may get a fully customized phishing experience, known as “spear phishing”, where the scammer will put more effort into a site clone or email text by including personal information to make it look more authentic. Phishing is normally easy to spot – check the address of any link before clicking on it. And if you’re asked to change a password through an e-mail, close the e-mail and log into the website through normal means, bypassing the bad links entirely.

Ransomware

A lot of ransomware is delivered by phishing, but since there have been an increasing number of widespread cases, it gets its own topic. However the user is fooled into running the malware on their computer, it encrypts valuable data or locks the user out of their system and demands payment to restore things back to normal. Whether this happens or not, upon payment, is anyone’s guess.

There have been a number of very high-profile ransomware attacks lately, including the ransomware that crippled the UK’s NHS and then spread globally. Will this ever end? The easiest mitigation strategy against ransomware, in addition to not clicking on suspicious links or applications and keeping your system up to date in the first place, is to keep frequent backups of your system so that if you do get ransomed, you won’t have to pay. Keeping backups has other benefits as well, of course.

Quid Pro Quo

The quid pro quo scam is really all “quid” and no “quo”. A service provider calls offering to fix a bug or remove malware (that doesn’t exist) for a fee. A quick search on YouTube will turn up thousands of videos of scammers trying their luck with wise-cracking teenagers. As with many cons, this scam can be avoided by simply not responding to out-of-the-blue offers. On the other hand, this scam seems successful enough that it’s still being run. Knowing about it is the best defense.

Tailgating

One way to get into a restricted area that’s protected by a closed door is to wait for an employee or someone with access and follow them in. These attacks are normally aimed at businesses or apartment buildings, and the solution is to simply not let anyone get in with you.

Dumpster Diving

To impersonate a legitimate contractor, it helps to know the names of the firms involved and even points of contact inside the firm. All of this data and more can be found on receipts in the dumpster behind the firm. Invest in a shredder, and don’t leave anything to chance.

Social Media

People share an amazing amount of personal information on social media, so it’s no surprise that it’s a new tool for social engineers. Looking through someone’s account is like looking at a snapshot of someone’s life. Why would you announce to literally the whole world that your home is going to be empty for the next two weeks? Your home is just asking to be burgled. Or think of the ammunition that you’re giving to a would-be spear phisher. Think about the trade-offs of sharing personal information about yourself publicly.

Notable Social engineering Case Studies

Now, let’s see a couple examples of these social engineering tricks in the wild.

News International Phone Hacking Scandal

Here in the UK, there was a huge public storm when News International, owned by media mogul [Rupert Murdoch], was found to be using social engineering to “hack” into the voicemail services of prominent celebrities, politicians, royals, and journalists. The phone hacking list is extremely long. They often hacked into the voicemail by spoofing the caller ID that granted access to the phone’s voicemail inbox. Some voicemails were password protected with four-digit codes that were easily guessed. On other occasions, they simply called the phone provider’s service hotline and said they forgot their pass code — plain-vanilla pretexting.

Celebgate iCloud Nude Pictures “Hack”

[Ryan Collins] used phishing techniques to gain access to the iCloud accounts of Jennifer Lawrence, Kate Upton, and Kim Kardashian. He created fake notifications from Google and Apple and sent them to his targets’ email addresses. At the time, there was speculation that Apple’s iCloud had been hacked into on a massive scale. Instead, Collins admitted in an interview that he used phishing techniques to gain access to his victims’ personal data.

Where do We Go From Here

If breaking the computer system is too difficult, you can be sure that criminals will try to break the human system. Whether you call this “social engineering”, “cons”, or “scams”, they’re likely to be on the rise. The best way to protect yourself is to teach anyone with access to your data or details about how the attacks work, and how to avoid them.

There are plenty of resources online that would be useful for helping protect yourself from these attack vectors. Protect yourself from eight social engineering attacks is quite a good starting point, and the US Department of Homeland Security also provides great information on preventing social engineering hacks that you can point people to.

In the end, most of it boils down to recognizing the patterns and being skeptical when you see them. Verify information through other channels, don’t blindly click links, and be wary of what personal details you give out to solicitors.

Microsoft will soon open its first two data centers in Africa

The content below is taken from the original (Microsoft will soon open its first two data centers in Africa), to continue reading please visit the site. Remember to respect the Author & Copyright.

Microsoft today announced that it will soon open two data center regions for its cloud-based services in Johannesburg and Cape Town, South Africa. This marks Microsoft’s first data center expansion into Africa, and the plan is to get these new regions online in 2018.

Like most of its other data centers around the world, these new regions will offer both Azure’s suite of cloud computing tools for developers and productivity tools like Office 365 and Dynamics 365. With no data centers in the region, developers and other Microsoft customers currently have to connect to data centers in Europe and accept the increased latency that entails.

“Few places in the world are as dynamic and diverse as Africa today,” Microsoft’s executive vice president for its Cloud and Enterprise group Scott Guthrie writes in today’s announcement. “In this landscape, we see enormous opportunity for the cloud to accelerate innovation, support people across the continent who are working to transform their businesses, explore new entrepreneurship opportunities and help solve some of the world’s hardest problems.”

The addition of these two new regions now brings Microsoft’s total number of regions to 40, significantly more than its biggest competitors. As far as competing cloud platforms go, Google currently offers its developers access to eight regions (but has a plan to aggressively increase this number over the course of this year) and Amazon’s AWS currently operates 16 regions and 42 availability zones.

It’s worth noting that neither Google nor Amazon currently operate regions in Africa, though the number of data centers in the region that are being operated by other companies continues to increase rapidly.

Featured Image: Getty Images

Introducing Google Cloud IoT Core: for securely connecting and managing IoT devices at scale

The content below is taken from the original (Introducing Google Cloud IoT Core: for securely connecting and managing IoT devices at scale), to continue reading please visit the site. Remember to respect the Author & Copyright.

By Indranil Chakraborty, Product Manager, Google Cloud

Today we’re announcing a new fully-managed Google Cloud Platform (GCP) service called Google Cloud IoT Core. Cloud IoT Core makes it easy for you to securely connect your globally distributed devices to GCP, centrally manage them and build rich applications by integrating with our data analytics services. Furthermore, all data ingestion, scalability, availability and performance needs are automatically managed for you in GCP style.

When used as part of a broader Google Cloud IoT solution, Cloud IoT Core gives you access to new operational insights that can help your business react to, and optimize for, change in real time. This advantage has value across multiple industries; for example:

  • Utilities can monitor, analyze and predict consumer energy usage in real time
  • Transportation and logistics firms can proactively stage the right vehicles/vessels/aircraft in the right places at the right times
  • Oil and gas and manufacturing companies can enable intelligent scheduling of equipment maintenance to maximize production and minimize downtime

So, why is this the right time for Cloud IoT Core?


About all the things

Many enterprises that rely on industrial devices such as sensors, conveyor belts, farming equipment, medical equipment and pumps – particularly globally distributed ones – are struggling to monitor and manage those devices for several reasons:

  • Operational cost and complexity: The overhead of managing the deployment, maintenance and upgrades for exponentially more devices is stifling. And even with a custom solution in place, the resource investments required for necessary IT infrastructure are significant.
  • Patchwork security: Ensuring world-class, end-to-end security for globally distributed devices is out of reach or at least not a core competency for most organizations.
  • Data fragmentation: Despite the fact that machine-generated data is now an important data source for making good business decisions, the massive amount of data generated by these devices is often stored in silos with a short expiration date, and hence never reaches downstream analytic systems (nor decision makers).

Cloud IoT Core is designed to help resolve these problems by removing risk, complexity and data silos from the device monitoring and management process. Instead, it offers you the ability to more securely connect and manage all your devices as a single global system. Through a single pane of glass you can ingest data generated by all those devices into a responsive data pipeline and, when combined with other Cloud IoT services, analyze and react to that data in real time.

Key features and benefits

Several key Cloud IoT Core features help you meet these goals, including:

  • Fast and easy setup and management: Cloud IoT Core lets you connect up to millions of globally dispersed devices into a single system with smooth and even data ingestion ensured under any condition. Devices are registered to your service quickly and easily via the industry-standard MQTT protocol. For Android Things-based devices, firmware updates can be automatic.
  • Security out-of-the-box: Secure all device data via industry-standard security protocols. (Combine Cloud IoT Core with Android Things for device operating-system security, as well.) Apply Google Cloud IAM roles to devices to control user access in a fine-grained way.
  • Native integration with analytic services: Ingest all your IoT data so you can manage it as a single system and then easily connect it to our native analytic services (including Google Cloud Dataflow, Google BigQuery and Google Cloud Machine Learning Engine) and partner BI solutions (such as Looker, Qlik, Tableau and Zoomdata). Pinpoint potential problems and uncover solutions using interactive data visualizations, or build rich machine-learning models that reflect how your business works.
  • Auto-managed infrastructure: All this in the form of a fully-managed, pay-as-you-go GCP service, with no infrastructure for you to deploy, scale or manage.

"With Google Cloud IoT Core, we have been able to connect large fleets of bicycles to the cloud and quickly build a smart transportation fleet management tool that provides operators with a real-time view of bicycle utilization, distribution and performance metrics, and it forecasts demand for our customers."
– Jose L. Ugia, VP Engineering, Noa Technologies

Next steps

Cloud IoT Core is currently available as a private beta, and we’re launching with these hardware and software partners:

Cloud IoT Device Partners

Cloud IoT Application Partners

When generally available, Cloud IoT Core will serve as an important, foundational tool for hardware partners and customers alike, offering scalability, flexibility and efficiency for a growing set of IoT use cases. In the meantime, we look forward to your feedback!

Amazon’s new Seattle headquarters will permanently house a homeless shelter

The content below is taken from the original (Amazon’s new Seattle headquarters will permanently house a homeless shelter), to continue reading please visit the site. Remember to respect the Author & Copyright.


By anastokm, May 16, ’17 2:30 PM EST

A rendering of Amazon’s new headquarters in Seattle. Half of the six-storey building on the right is planned to house the shelter.