Top News
The future of the Internet of Things (IoT) looks bright, with the industry showing signs of large-scale growth in the coming years. The IoT market is estimated to top $1.4 trillion by 2027, up from $250 billion three years ago. As the fight against the COVID-19 pandemic heats up, businesses and governments are turning their focus to IoT as a potential source of help. The technology has shown promise in helping enforce social distancing, ensuring equipment availability, and automating tasks previously performed by humans. With this adoption, we are likely to see a larger surge in IoT use than was anticipated before the pandemic. Here are some IoT cloud trends you are likely to see going forward.
- Cloud Efforts Are Taking Shape This Year (Monday, 01 March 2021)
- Why You Should be Developing Cloud-Native Apps (Monday, 22 February 2021)
- IBM Eases Deployments With Red Hat Marketplace (Sunday, 06 December 2020)
- Ready Your Cloud Operations for Year-End 2020 (Monday, 09 November 2020)
Storage
Today’s diverse cyber threats, from ransomware, phishing, and malware to rogue mobile apps, place an enormous burden on information security organizations. We built RiskIQ to provide comprehensive discovery, intelligence, and mitigation of threats associated with an organization’s digital presence, and to help businesses leverage the technologies and tools necessary to analyze cyber-attacks, assess risk, and take action against digital threats.
- AWS Releases The Open Source Library AutoGluon For AI Development (Monday, 10 February 2020)
- Containers Are An Important Tool For Developers Deploying To The Cloud, But Look Out For These Vulnerabilities (Monday, 16 September 2019)
- How the Cloud Leverages Open-Source Solutions (Wednesday, 11 September 2019)
- Managing Your Cloud Computing Costs (Wednesday, 13 March 2019)
Solutions for Archiving Sensitive Data in the Cloud
Many businesses and individuals are familiar with the concept of cloud backups. All major providers as well as consumer manufacturers like Apple offer easy methods with which to back up data to cloud storage. Using the provider’s resources eliminates the need for clients to purchase hardware and keeps backups offsite where they are safely separated from the original data.
Archiving data is sometimes confused with backing it up. While both processes are used to store data, the reasons for performing them are very different. Backups are created primarily for recovering systems and data in the aftermath of unforeseen events. Backups make copies of existing data that can be used to easily replace lost or corrupt files.
Archives are used for the long-term storage of specific data elements that are needed to satisfy legal or regulatory requirements. An archive may be the only copy of the given data, making it critically important that it is stored safely and protected against possible loss. Archived data often contains sensitive or personally identifying information, making it imperative that it is stored securely to meet privacy regulations.
Differences in Data Availability
The difference in why backups and archives are created affects issues such as how they are stored and the speed at which they need to be accessed. Backups need to be readily available to address unexpected outages or data loss scenarios. Mission-critical systems are often configured to immediately fail over to backups to avoid or minimize downtime.
Archived data normally does not need to be accessed as rapidly as does backup data. Information needed to provide evidence to auditors or furnish documentation for corporate lawyers usually has less demanding time requirements. Retrieval of the necessary data can be scheduled so it is available when needed. In the majority of cases, archived data does not need to be immediately available.
Choosing a Cloud Archiving Solution
The following factors need to be considered when selecting a cloud archiving solution.
- Data security and durability - Since a single copy of important data is often archived, the provider needs to ensure the data won’t be lost, corrupted, or accessed by unauthorized personnel.
- Integrated compliance management - Archival data subject to regulatory guidelines needs to be appropriately managed by the cloud provider.
- Storage location - In some cases, data needs to be stored in specific geographical regions to satisfy compliance requirements.
- Data access time - Various cloud vendors provide different access times for archived data versus other types of storage.
Following are two examples of the archive offerings of major cloud providers.
Amazon S3 Glacier and S3 Glacier Deep Archive
This offering by Amazon Web Services (AWS) promises 99.999999999% data durability with comprehensive security and compliance capabilities. The cost of storage can be as low as $1 per terabyte per month. Glacier has three access tiers that range from a few minutes to several hours. Deep Archive has two options that return data in 12 or 48 hours.
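As a rough illustration of how archival storage works in practice, here is a minimal sketch using the boto3 SDK that writes an object directly into the Deep Archive storage class and later requests a temporary restore. The bucket and key names are placeholders, not values from any real deployment.

```python
import boto3

s3 = boto3.client("s3")
bucket = "example-archive-bucket"            # placeholder bucket name
key = "2020/contracts/contract-0001.pdf"     # placeholder object key

# Upload the object directly into the Deep Archive storage class.
with open("contract-0001.pdf", "rb") as f:
    s3.put_object(Bucket=bucket, Key=key, Body=f, StorageClass="DEEP_ARCHIVE")

# Later, request a temporary restore. Deep Archive's two retrieval options
# (roughly 12 or 48 hours) correspond to the Standard and Bulk tiers.
s3.restore_object(
    Bucket=bucket,
    Key=key,
    RestoreRequest={
        "Days": 7,  # keep the restored copy readable for 7 days
        "GlacierJobParameters": {"Tier": "Bulk"},
    },
)
```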
Google Cloud Nearline, Coldline, and Archive
Google offers three archival solutions to address the varying needs of accessing archived data. They promise low latency and availability of archived data, with a low cost per gigabyte and 99.999999999% durability of objects over a given year. Data is protected by Google-grade security and redundant storage.
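For comparison, a similar sketch with the google-cloud-storage client creates a bucket whose default storage class is Archive; the bucket name and location are assumptions for illustration, and Nearline or Coldline could be substituted the same way.

```python
from google.cloud import storage

client = storage.Client()

# Create a bucket that defaults to the Archive storage class
# ("NEARLINE" and "COLDLINE" are the other archival classes).
bucket = client.bucket("example-archive-bucket")   # placeholder name
bucket.storage_class = "ARCHIVE"
bucket = client.create_bucket(bucket, location="us-central1")

# Objects written to the bucket now land in the Archive class by default.
bucket.blob("2020/contracts/contract-0001.pdf").upload_from_filename("contract-0001.pdf")
```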
The cloud offers a viable method of archiving sensitive data without incurring the capital costs of procuring additional on-premises storage. Organizations with archiving needs should investigate how the cloud can satisfy them.
The Benefits of Creating Big Data Lakes in the Cloud
Data lakes are centralized repositories that are used to store, process, and secure large amounts of data. They can handle structured, semistructured, and unstructured data in their native formats. They are becoming increasingly important to data-driven businesses looking to maximize the value of big data and enterprise information resources.
The reason behind the rising importance of data lakes is their ability to provide an analytical environment that supports multiple tools, languages, and workloads. Data lakes provide raw informational materials that can be extracted for numerous purposes including business intelligence (BI), machine learning (ML), and artificial intelligence (AI) processing.
Constructing a Data Lake
Data lakes can be built using on-premises hardware or cloud resources. There are several characteristics of cloud data lakes that make them a more flexible and effective way to handle big data resources.
Storage capacity
Data growth is one of the major challenges of managing data lakes. As new data streams are made available, capacity requirements often change. In an on-premises data lake, this entails continually monitoring capacity and purchasing new hardware when necessary.
Cloud lakes remove any worries about exceeding storage capacity. Cloud storage resources are essentially infinite and can easily be added to address evolving capacity requirements.
Compute power and flexibility
The compute and software resources of the cloud provider are available to cloud data lakes. This means the analytic engines and compute power can be used on-demand for a variety of purposes. Multiple teams can access the same data using the cutting-edge software solutions made available by the provider.
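To make the on-demand analytics idea concrete, here is a minimal sketch that runs a SQL query against data already sitting in a cloud data lake, using Amazon Athena through boto3. Athena is just one of many engines that can query lake data in place, and the database, table, and bucket names below are hypothetical.

```python
import time
import boto3

athena = boto3.client("athena")

# Run an ad-hoc SQL query directly against raw objects in the data lake.
# Database, table, and bucket names are illustrative placeholders.
query = athena.start_query_execution(
    QueryString="SELECT product_id, COUNT(*) AS orders FROM sales GROUP BY product_id",
    QueryExecutionContext={"Database": "example_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-lake-query-results/"},
)

# Wait for the query to finish, then read the results.
query_id = query["QueryExecutionId"]
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```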
Replicating the infrastructure elasticity of a cloud data lake in an on-premises data center requires a substantial effort in planning and capital expenditures to procure the necessary hardware. Inaccurate planning can result in a lot of expensive hardware sitting around waiting to be deployed.
Cost
Costs for cloud data lakes are minimized by the “pay for what you need” nature of cloud computing. Using on-demand software tools is often less expensive than obtaining dedicated licenses. Unused hardware for erroneously anticipated compute or storage needs is a budgetary nightmare. Cloud data lakes eliminate the problems associated with purchasing unnecessary processors or storage devices.
Choosing a Cloud Data Lake Provider
All major cloud providers can furnish their customers with the resources needed to create a data lake. Following some simple guidelines can help ensure that you select the right provider to address the needs of your business.
- Make sure the data lake solution you choose can easily be integrated with your current computing environment. You don’t want to use incompatible systems that lead to data silos and inefficient use of enterprise information.
- Enterprise-grade security is a must in any cloud data lake implementation.
- Select an offering that your budget can afford.
- Ensure the data lake solution chosen is capable of working with the types of data you plan to store in it.
Some additional management complexity may accompany housing a data lake in the cloud versus on-premises, but the benefits promise to make these issues negligible. You can provide your analysts with a horizonless data lake from which to pull unimagined insights from big data resources. Sounds like a good place to be.
Financial Services Are Finding a Safe Home in the Cloud
Banks and other financial institutions demand a high degree of security, compliance, and performance to protect their information technology (IT) resources. In many cases, the demand for security has been addressed with in-house legacy systems relying on mainframe computing power. But the growth of e-commerce and cryptocurrencies has led some financial services decision-makers to look for alternative solutions.
While cloud providers have been offering dedicated business services for years, there has been a dearth of solutions that target the needs of the financial services industry. That changed in 2020 with the announcement of the IBM Cloud for Financial Services. In March of 2021, Microsoft announced that they were also launching a Cloud for Financial Services offering. Let’s see how these offerings address the security, compliance, and performance requirements of a very demanding group of clients.
IBM Cloud for Financial Services
IBM’s offering is designed to allow financial institutions to use a transparent public cloud ecosystem to deliver innovative and personalized services to their customers.
By partnering with more than 50 independent software vendor (ISV) and software-as-a-service (SaaS) organizations, IBM aims to meet the exacting security, resiliency, and compliance needs of the financial services community. Here are some highlights of IBM’s offering.
- Customers gain access to extensive financial services industry expertise backed by IBM Promontory, a global leader in regulatory compliance consulting.
- The IBM Cloud Security and Compliance Center platform provides a dashboard from which customers can monitor their security and compliance standing to identify areas that need to be strengthened.
- IBM Cloud Security protects data at rest, in motion, and in use.
- IBM OpenPages with Watson simplifies the management of regulatory compliance using a unified governance, risk, and compliance (GRC) solution powered by artificial intelligence (AI).
- IBM Cloud Hyper Protect Crypto Services offer data encryption provided with a dedicated cloud hardware security module. Customers maintain full control and access to the encryption keys.
Microsoft Cloud for Financial Services
Microsoft is entering the cloud financial services market intending to provide clients with the ability to enhance their customers’ experience, modernize systems, and manage risk. Following are some of the specific ways Microsoft’s offering plans to address the needs of the financial services industry.
- The service strives to improve the online banking experience through personalized interactions to facilitate customer loyalty and profitability. Institutions can gain a complete picture of customer financial, behavioral, and demographic data. The tools can suggest appropriate actions tailored to individual customers.
- Self-service tools are offered through banking portals and mobile apps, giving customers more control over their financial transactions and decisions.
- Automation of the lending process will enable the activity to be streamlined while reducing errors.
- Through the use of powerful analytics and data modeling, greater insight into the regulatory and compliance aspects of financial services can be gained, minimizing the institution’s level of risk.
If history is an indicator of things to come, we can expect to see more cloud providers offering services that speak to the needs of the financial services industry. The expectation is that more financial institutions will take advantage of secure and dedicated cloud services to avoid being left behind as the industry continues to evolve.
Amazon Wins the Job as the PGA Tour's New Caddy
The PGA Tour is committed to enhancing the experience of its fans via a new partnership with Amazon Web Services (AWS). They are introducing new ways to provide PGA Tour content to their fanbase who have been restricted from attending tournaments in person due to COVID-19 safety and health protocols. These innovative options offer the Tour’s streaming audience a customizable experience that will complement the eventual return of on-course fans.
AWS becomes the PGA Tour’s Official Cloud Provider with this deal and will use its vast array of computing resources to transform the way fans interact with Tour content. AWS media services will streamline video delivery for both televised coverage and streaming viewers. Video content will be simultaneously processed, properly formatted, and distributed for viewing on multiple platforms including mobile devices.
One of the challenges of televising a golf tournament is the size of the playing field and the fact that multiple players are active simultaneously. This forces the telecast’s producers to continuously make decisions regarding what action to present to viewers. Those in charge usually focus on the tournament leaders and those in contention, with an occasional glance at spectacular shots by other players. Some golfers may not appear on a telecast at all.
Fans of the world’s top golfers will be able to follow the exploits of their favorite players with more reliable and faster performance of the “Every Shot Live” offering. Currently only available for selected tournaments such as the Players’ Championship, this streaming option makes every shot from every player available live to subscribers. It essentially enables viewers to virtually walk the course with whichever player or players they wish. In doing so, “Every Shot Live” will also allow fans to get a unique perspective of courses that many of them will never play in person.
Another feature made possible by the partnership with AWS is TOURCast, which provides fans with a customizable viewing experience that includes watching a shot from multiple camera angles. Speed rounds will enable a player’s complete round to be viewed after the competition in a fraction of the time required to play the original 18 holes.
Using Amazon Simple Storage Service (Amazon S3), the PGA Tour will build a data lake to be used by fans, commentators, and producers. They plan to house their catalog of video, images, and audio from the 1928 Los Angeles Open to the present and will augment the data lake with live coverage from future tournaments. This tremendous volume of data will be analyzed using Amazon’s deep learning service, Amazon Rekognition. The content will be annotated and tagged with metadata so it can be easily searched and packaged to provide new viewing offerings during live telecasts and personalized streaming.
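As a rough sketch of how such tagging could work, the snippet below starts an asynchronous Amazon Rekognition label-detection job on a video stored in S3 and prints the timestamped labels. The bucket, key, and confidence threshold are illustrative assumptions, not details of the Tour's actual pipeline.

```python
import time
import boto3

rekognition = boto3.client("rekognition")

# Start asynchronous label detection on a video object stored in S3.
# The bucket and key here are illustrative placeholders.
job = rekognition.start_label_detection(
    Video={"S3Object": {"Bucket": "example-media-lake", "Name": "archive/final-round.mp4"}},
    MinConfidence=80,
)

# Poll until the job finishes (production code would listen for the
# SNS completion notification instead of polling).
while True:
    result = rekognition.get_label_detection(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(10)

# Each detected label carries a timestamp, the kind of metadata that
# makes a large video catalog searchable.
for item in result.get("Labels", []):
    print(item["Timestamp"], item["Label"]["Name"], round(item["Label"]["Confidence"], 1))
```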
The PGA Tour joins other sports organizations who have realized the potential for the cloud to bring innovative alternatives to their fans. It’s just another example of how the cloud will continue to transform the world of business, sports, and entertainment in the years to come. I suspect golf fans will miss the old days as much as they miss wooden drivers, which is to say not at all.
Leverage Cloud Options for Economic Recovery
Digital technologies have played a crucial role in keeping things working during the COVID-19 pandemic. They have kept the economy and society running through remote work and contactless payments while supporting communication and ecommerce. In 2020, many businesses suffered because they were not prepared to handle a disruption on the scale of the coronavirus. Many responded by accelerating digital transformation, keeping operations running and production ongoing thanks to remote work and online collaboration tools.
Cloud computing was a true game-changer during the pandemic, enabling ecommerce and collaboration to thrive. Cloud-based platforms handled billions of ecommerce purchases and teleconferences each day, a critical element in the survival of many businesses.
According to research by Global Data, more than 80 percent of business executives say that cloud computing and networks were crucial in helping businesses withstand the effects of the pandemic. In retail, for example, the cloud, whose adoption accelerated during the pandemic, helped connect businesses with their customers when no in-store purchases were allowed. According to Deloitte, online grocery sales rose by 233 percent in March 2020 compared to the same month in 2019.
The need to respond to the coronavirus pandemic with agility has renewed the emphasis on adopting cloud-based applications. These applications will support modern technologies such as robotics, big data, and digital distribution channels, as well as the increased adoption of digitization post-pandemic. Expect crucial workplace functions such as communications and core business applications to move to the cloud to improve availability, reliability, and resilience, all of which are critical to keeping operations running in changing post-COVID times. With computational needs expected to fluctuate, the cloud can scale immediately to match daily increases or reductions in business requirements, minimizing losses.
In healthcare, cloud computing and AI have become prominent over the past two years in research and in helping medical professionals and researchers fight the pandemic. Researchers have turned to cloud-based AI technologies to analyze vaccines and search for potential cures for the disease. In the US, Europe, and Asia, these efforts have helped shorten the vaccine development process, saving millions of dollars in time and research investment. Advanced communication technologies will also help patients access healthcare in the cloud anytime, regardless of location. In short, cloud computing is helping build value-driven, patient-centric care.
The transport industry is also benefiting immensely from cloud computing. The rise of connected cars and buses requires massive data storage, which means functions such as traffic management will depend on the cloud in ways that traditional methods could not handle. Cloud-based traffic management systems will ease the coordination of traffic signals and the tracking of traffic emergencies.
Beyond healthcare, transport, and commerce, cloud computing will benefit small and medium-sized businesses. It promotes operational efficiency, spurs innovation, and expands access to financing and markets. Digital transformation is bringing SMEs significant benefits that are critical for local economic development.
After the pandemic, businesses will have to focus on digitizing their operations because that is where the era is heading. Technologies such as cloud computing will help companies become more self-sufficient, reduce operating costs, and ensure intelligent and effective service delivery.
Benefits and Risks of Cloud Migration
Almost every organization today seems to have embraced cloud computing, driven by the advancements that have been made in the field. While the cloud offers tremendous opportunities, there are benefits and risks of migration that you need to consider. Here are the main benefits and risks:
Benefits
- Scalability
Cloud services are offered on demand using a pay-as-you-go model. This means that when your company grows, or demand spikes seasonally, operations are not affected. Unlike on-site systems, which depend on the infrastructure already installed, you can scale up or down at any time. For instance, when a business experiences traffic beyond its existing plan, it can change the plan to accommodate the new load.
- Reduced cost
Another major benefit of migrating to the cloud is reduced cost. By moving to the cloud, companies can save significant money, especially over the long term. The cloud does not require purchasing on-premises hardware or making a large upfront investment; you simply buy a plan that fits your needs. This lets you avoid costs such as purchasing and operating servers, electricity, and management overhead.
- Increased collaboration
In the modern business environment, collaboration is fast becoming an area where every organization seeks a competitive edge. Because of this, many companies are investing in cloud computing to boost their collaboration efforts. The cloud can be accessed over the internet from anywhere, so employees can work together regardless of location. Documents and files can be accessed simultaneously and updated in real time.
- Security
Security is one of the biggest misconceptions about the cloud. In reality, one of the benefits of moving to the cloud is improved security. Cloud service providers maintain strict security infrastructure and the expertise today’s threats demand. Because providers make cybersecurity a top priority, the cloud is often more secure than on-site systems.
Risks
- Speed
For any modern organization, speed is critical, and a business cannot afford to lose it. Unfortunately, speed can be a limiting factor in cloud computing. If your organization runs applications, databases, or software that require above-average performance, the cloud might not meet those needs. The good news is that cloud service providers often allow testing before you migrate your operations.
- Lack of expertise and adequate knowledge
Migrating to the cloud requires the right expertise and people with adequate knowledge. Without them, you will not realize the full potential of the cloud, and the migration itself might fail. Proper migration requires putting the right systems in place and understanding how they work. Although your team may have deep expertise with physical hardware, cloud technology can prove unfamiliar and complex.
- Legal issues and restrictions
The next risk of migrating to the cloud is legal restrictions. Before moving, an organization must first understand whether any legal restrictions prevent it from migrating. For example, some government contractors may not be allowed to put their data in the cloud, making on-site solutions necessary for their operations. Common regulations include PCI and HIPAA. Although some cloud providers are certified to handle PCI and HIPAA workloads, compliance remains a significant risk that cannot be ignored.
Here's What You Need To Know to Land a Job in the Clouds
Whether you are a techie or a professional in a field unrelated to IT, cloud computing is affecting your job one way or another. The technology has been praised by tech evangelists as a solution to many business problems and tipped to turn around the fortunes of organizations both small and large. Since it will change work in many ways, learning more about it and becoming cloud-savvy can advance your career and may earn you a higher salary and greater respect.
Here is what you need to know about cloud computing if you want to land your dream job.
- Skills of innovation and vision
This is the most critical area, and the one most often overlooked by cloud computing enthusiasts and experts. Contrary to what many people believe, cloud computing is more than simply moving storage from in-house to a remote location and paying the monthly charges that come with it. It starts with understanding the needs of the business and the potential impact of the technology on your operations. You need to understand how to embrace the technology, pilot new offerings to find out what suits you, and compare what it delivers against on-site methods.
- Possess business communication, project management, and leadership skills
The primary reason for choosing the cloud over on-site systems is to enhance operations and improve the business in general. For this reason, anyone working at a cloud company must learn the business communication skills needed to run operations in an organizational setting. You should also have leadership and project management skills, which are crucial for pitching and selling cloud services to customers and managing projects across the organization. Leadership skills let you handle both simple and complex projects and provide direction on high-priority and special initiatives. With the right business skills, you will be able to recommend the best technological alternatives to clients and evaluate cloud developments against changes in the business landscape.
- Have negotiation skills
In a business landscape where the customer has the final say, negotiation skills are no longer optional. This is especially true in cloud computing, where customers must be convinced to adopt the services of a given vendor. As someone looking to join the industry, you must have these skills. As an employee, you will be responsible for negotiations, managing customer relationships, tracking issues, troubleshooting, and reporting. These skills are crucial for explaining service-level agreements to customers, answering questions about service interruptions, and making the case for investing in your company’s solutions.
- Have planning and analytical skills
Cloud computing solutions must address the needs of each business, which means developing flexible solutions that fit each customer regardless of industry. If you plan to find a job at a cloud service provider, you need analytical, planning, and architectural skills that enable you to understand the problems a customer needs addressed and devise a viable solution. The same skills let you understand business needs and translate them into technical requirements.
- Be technically proficient
This is the last and the most critical skill you should possess to land a job in a cloud company of your choice. Every professional who consumes, manages, or provides cloud services must be technically savvy. You must be proficient in areas such as software engineering and specifically in the development of cloud applications. You must also be knowledgeable in network administration and management and, at the same time, possess cybersecurity skills.
Cloud Computing will be the Great Enabler of Mobile Robotics
Though only in its nascent stages, the value of cloud infrastructure to robots is key for both deployment (encompassing development, configuration, and installation) and operation (maintenance, analytics, and control). With the popularization of mobile robotics across a wide range of verticals, it will become necessary to utilize the computing power of cloud infrastructure to store and manage the vast troves of collected data as well as to train more advanced algorithms used to power robot cognition. ABI Research, a global tech market advisory firm, forecasts that robot-related services powered by cloud computing will reach US$157.8 billion in annual revenue by 2030.
“Since 1961, most commercial robots have been wired or tied to external infrastructure for movement. The next generation of robot deployments will be increasingly mobile, tied to cellular and WIFI connectivity, will consume vast troves of data in order to operate autonomously, and will need effective management through real-time measurements for performance, status and operability,” said Rian Whitton, Senior Analyst at ABI Research. Several cloud service providers, including AWS, Microsoft Azure, and Google Cloud, have begun collaborating with robotics developers, while start-ups like InOrbit target cloud-enabled operations for the first major deployment of mobile service robots.
“The journey of the robot industry from one of individual vehicles and units, to fleets and larger systems, is being driven by its wider incorporation into the IoT ecosystem. However, it would be a mistake to suggest robots will simply fit in with devices, individual sensors, and stationary machines as part of the wider IoT ecosystem,” Whitton points out. Robots are increasingly sophisticated systems themselves, with multiple sensors and highly advanced Artificial Intelligence (AI)/Machine Learning (ML) competencies, and are also expected to move around and act within the world, generating huge amounts of data relative to other machines. “To suggest the cloud alone can provide the computing power to operate these machines is naïve, especially during the slow transition to 5G. Onlookers should instead conceive of adaptable edge-cloud systems that focus on quality over quantity when it comes to robotics operation, data processing, and analysis,” Whitton adds.
The cloud robotics opportunity, defined as Robotics-as-a-Service (RaaS) and Software-as-a-Service (SaaS) revenue for robotics operations combined, will grow from US$3.3 billion in 2019 to US$157.8 billion in 2030, accounting for 30% of the robotics industry’s total worth. On its own, this represents a huge opportunity for start-ups, many of which are beginning to expand on their mission to enable developers to accelerate their go-to-market strategy and to help end users and operators access and manage ever-increasing fleets of robots. This new robotics ecosystem will be dominated by three subcategories of companies: robot developers that move up the value chain and become solution providers, third-party IoT and cloud platform providers focused on best-in-class software solutions, and Cloud Service Providers (CSPs) like Microsoft Azure, Amazon Web Services (AWS), and Google Cloud. Those focusing strictly on hardware will lose relative worth and will require partnerships or bold strategies to become solution providers. This can be exemplified by companies like Universal Robots and Fetch Robotics, which have incorporated software and maintenance services into their offerings.
“The market is incredibly nascent at present. ABI Research expects consolidation with the most successful robot solution providers and the CSPs expanding their relative influence on the market to take place within the next decade,” says Whitton. The cloud robotics technology is split between vertical innovations, such as developing superior navigation systems, which increase the possibility of what robots can do, and horizontal innovations that expand access and scalability. “Cloud computing represents the most important horizontal innovation for the robotics industry, to date, and will further enable vertical innovations like swarm-based intelligence, autonomous mobility, and advanced manipulation to be deployed at scale,” Whitton concludes.
These findings are from ABI Research’s Cloud Robotics application analysis report. This report is part of the company’s Industrial, Collaborative & Commercial Robotics research service, which includes research, data, and ABI Insights. Based on extensive primary interviews, Application Analysis reports present in-depth analysis on key market trends and factors for a specific technology.
Cloud Video Security
Cloud security video storage can provide an edge, reports Security Magazine.
Video storage is an important consideration in any surveillance project while simultaneously being one of the most overlooked. Let’s face it: storage does not exactly provide the “wow factor” of analytics or 4K image quality, but it is the backbone on which entire video security systems are built.
Read the article on Security Magazine
Cost Control in the Cloud
According to Forbes, the biggest problem with the cloud isn’t security, it is cost.
I regularly run into clients looking at customer relationship management systems who say they have concerns about the cloud. Their biggest concern is always about their data.
Read the article on Forbes
Migration is the Biggest Risk for Cloud Companies
According to Cloud Wars, the biggest risk for cloud vendors is helping their customers complete their migrations to the cloud.
As more businesses move their on-premises databases to the cloud, many face a challenge that may take months or even years to work through.
Read the article on Cloud Wars
Cloud Security Falters Under Heavy Load
The quick migration to the cloud brought on by the pandemic caused an increase in security incidents, reports Tech Republic.
Based on internal data, Unit 42's latest "Cloud Threat Report" found that organizations increased their cloud workloads by more than 20% between December 2019 and June 2020. Along the way, cloud security incidents rose by 188% just in the second quarter of 2020.
Read the article on Tech Republic
How to Obtain FedRAMP Approval for Government Cloud Services
Taking advantage of the benefits of cloud computing is not restricted to organizations operating in the private sector. In the United States, local, state, and federal governments make extensive use of public cloud resources. As is customary for all types of government contracts, some assessments and approvals must be obtained before the offerings of a cloud service provider (CSP) can be used by specific agencies. Government contracts can be very lucrative and service providers who want a piece of the business need to demonstrate that their products meet all requirements surrounding security and functionality.
All CSPs that wish to do business with the U.S. federal government need to be assessed and approved under the Federal Risk and Authorization Management Program (FedRAMP). The program’s goal is to protect the data of U.S. citizens in the cloud, and it is the most rigorous security framework in use by the government.
FedRAMP was created to address the problem of different and potentially conflicting requirements for each agency working with cloud providers. FedRAMP provides standard security baselines and processes that simplify the process of obtaining cloud services for both providers and government agencies. Once a CSP achieves FedRAMP approval for an offering, it is listed in the FedRAMP Marketplace to gain visibility across the government.
Navigating the FedRAMP Authorization Process
CSPs that want authorization to provide services to federal agencies need to follow a process comprising three complementary phases.
In the pre-authorization phase, CSPs should complete FedRAMP training which includes modules that define the baseline security plan. Education can be accessed via online courses, webinars, or in-person training events. A request from the CSP will result in a consultation with government subject matter experts set up by the FedRAMP Program Management Office (PMO). To successfully get through this phase of the authorization process, a CSP needs to:
- Document agency interest in their offering and establish partnerships with agency customers.
- Establish a partnership with an approved third-party assessment organization.
- Ensure that the service implements the required security controls.
During authorization, a CSP is responsible for developing a package that includes the completion of the System Security Plan. The plan is then assessed by the third-party assessment organization and findings are presented to the CSP for remediation. When all risks have been successfully addressed, the CSP attains authorization and status as a FedRAMP authorized vendor.
In the post-authorization phase of the process, the CSP is required to provide monthly monitoring deliverables to the agency using its service. Failure to provide these documents can result in the service losing its authorization.
The purpose of FedRAMP is to eliminate any confusion regarding the ability of individual agencies to use cloud services. Because authorized services are published on the FedRAMP Marketplace, the authorization process only needs to be completed once for each offering. Once approved, a service can be used with confidence by any federal agency that wants to use it.
This appears to be an example of government working efficiently by reducing the duplicate work that would ensue from individual agencies or departments authorizing CSPs. In a subsequent post, we will take a closer look at the FedRAMP Marketplace and the agencies that use its authorized services.
The Increased Use of Custom Hardware in the Cloud
The thirst for convenient and scalable computing power has led many organizations to make use of cloud delivery models. This has resulted in a booming worldwide market for cloud computing that is expected to top $350 billion by 2022. With this kind of money in play, the competition to attract customers among cloud providers is fierce.
Artificial intelligence (AI) and machine learning (ML) are two transformative technologies that are gaining traction in many areas of scientific research and business analytics. The cloud offers a platform from which organizations of all sizes can access the level of computing power necessary to conduct research and make effective use of these disciplines. Consequently, major cloud providers have bolstered their portfolios with cutting-edge products designed to attract companies interested in AI and ML.
Developing new solutions or delivering advanced functionality to popular platforms is one way a vendor can stand out against its rivals. A method of gaining a competitive advantage that is enjoying increased popularity among large providers is the use of custom-built hardware that is designed to optimize their software offerings. Let’s take a look at some of the hardware solutions and innovations that cloud vendors are employing to provide enhanced capabilities to their customers.
Google’s tensor processing units (TPUs) are a prime example of hardware dedicated to processing the workloads demanded by AI applications. These are custom-built chips designed specifically to provide the power required by AI systems. The chips are often favored over more traditional graphics cards for the higher speeds they offer.
Google has developed an infrastructure option called Cloud TPU Pods that use server racks packed with TPUs. They are configured with either 256 or 1,024 TPUs and the larger configurations offer speeds that approach those of supercomputers. The company powers its popular search engine and Google Translate with TPU pods.
Amazon Web Services’ Graviton processors are custom built by the company using 64-bit ARM Neoverse cores. The processors offer cost-efficient scalability for general-purpose, compute-optimized, and memory-optimized EC2 instances. Second-generation Graviton processors provide an improved price-performance ratio over x86 chips for diverse workloads such as video encoding and CPU-based machine learning. Security is enhanced with always-on 256-bit DRAM encryption, allowing developers to run native cloud applications securely.
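Using a Graviton-based instance is mostly a matter of choosing an ARM instance type. The sketch below launches a single m6g instance with boto3; the AMI ID is a placeholder, since you would look up a current arm64 image for your region.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# The "g" in m6g denotes an instance family backed by Graviton processors.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder arm64 AMI
    InstanceType="m6g.large",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```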
Microsoft’s Project Olympus is an open-source solution that incorporates hardware and software modules to create a holistic rack architecture. Project Olympus is part of the Open Compute Project which is releasing technical information to the developer community that may help even the playing field in the server industry.
The schematics offer a well-defined starting point upon which manufacturers can build custom hardware solutions. Servers built under the auspices of Project Olympus will include extensive debugging and testing capabilities that help isolate intermittent and hard-to-duplicate problems.
As the computing world’s requirements for faster and more efficient processing continue to grow, the cloud-tech giants will continue to evolve their solutions to meet the demand. The immense resources they wield ensure that when the need calls for it, the financial cost of employing custom hardware will not be an obstacle.
Cloud Sticker Shock Is a Thing
The coronavirus pandemic has altered the strategies of many companies, including their spending. The change has tipped the scale in favor of remote technologies, which are now being adopted en masse as work shifts from the office to the home. With the revenues of many businesses suffering from the virus, organizations are trying to limit costs as much as possible, opting for remote work solutions, changing shifts, and putting some staff on unpaid leave. Amid all this, cloud expenditure appears to be rising as companies acquire more cloud-based solutions. Cost is always a major driver of migrations, but most companies understand it poorly at the outset.
It is easy for startups that have never run their own infrastructure to see the value of the cloud, but for companies still using old systems and legacy software, estimating the potential cost of moving to the cloud is not that easy. However, with the speed of modern business and the rapid response to the coronavirus pandemic aimed at enabling millions of employees to work remotely, it is now clear that operating without cloud technology will not be easy in the modern era.
Few periods in history have put as many industries to the test as the one we are experiencing today. The pandemic has underscored the need for IT resources that are always available, led by the cloud. Even with the virus affecting most operations, the cloud has continued to prove its usefulness to businesses. As the damage from the coronavirus continues to affect office-based operations at different companies, cloud computing is turning out to be the refuge these businesses look to in their time of need, and companies are relying heavily on technology to keep going. According to a report by Flexera, more than 20 percent of companies spend more than $1 million every month on the cloud. That figure is expected to double in the next 12 months, with companies projecting that their cloud spending will grow by 47 percent on average. The report, based on a survey of 750 decision-makers and users, also points out that companies are 23 percent over their cloud budgets, while respondents estimate that 30 percent of their cloud expenditure is wasted.
Even with the pressure from the coronavirus, companies must devise a way of managing their cloud expenditure, balancing their response to the pandemic against keeping spending reasonable. Although spending in most other IT categories will fall this year due to suspended operations, cloud technologies will still command the largest share. Companies will need to support their on-premises resources while improving the working experience of remote employees. The problem is determining what is needed and for how long, since no one knows when the pandemic will end. Companies should use data from mid-March onward to project what is required and what is not; the same information can help forecast which costs will go up and which will go down.
With costs going up, companies must employ new tactics to stay on top of expenses, many of which are unpredictable. Even though cost management is challenging because of the complexity and difficulty of usage forecasting, as Flexera shows, businesses should consider monitoring their usage and spending daily. They should also take down unused workloads to reduce operating costs at a time when spending needs to be limited as much as possible.
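One building block of that kind of daily monitoring is simply pulling day-by-day spend from the provider's billing API. The sketch below uses the AWS Cost Explorer API via boto3 to list the past week's cost by service; the chosen metric and date window are illustrative.

```python
import datetime
import boto3

ce = boto3.client("ce")  # AWS Cost Explorer

# Pull day-by-day unblended cost for the last week, grouped by service.
end = datetime.date.today()
start = end - datetime.timedelta(days=7)

report = ce.get_cost_and_usage(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for day in report["ResultsByTime"]:
    for group in day["Groups"]:
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        if amount > 0:
            print(day["TimePeriod"]["Start"], group["Keys"][0], round(amount, 2))
```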
Despite Covid-19, Streaming Services Show No Sign of Slowing Down
As many industries reevaluate their go-to-market strategies amid the pandemic, not every business is suffering. Take, for instance, streaming video on-demand (SVOD) services like Netflix and Disney+, which have seen an increase in viewership as people are mandated to stay home due to Covid-19. But will streaming services see subscribers drop off once people venture outdoors again, and are all SVODs doing well in this time of uncertainty?
Where other SVOD services have failed to grow (I’m looking at you, Quibi), Disney+ wasted no time, amassing nearly 55 million subscribers in just six months. The streaming service, which rolled out in November, already has more subscribers than Hulu, which clocks in at around 32 million.
Netflix, still the leader in streaming services, has a whopping 182.8 million subscribers. During its earnings call in April, Netflix reported 23% more subscribers than at this time last year and $5.7 billion in revenue. While many analysts wonder whether these new subscribers will stay once the pandemic is over, at least one Bank of America analyst doesn’t think the company should be worried. Nat Schindler told Yahoo Finance, “We anticipate the step-up will result in a permanent increase in penetration for Netflix's subscriber model and see its low price-point and staple nature supporting healthy fundamentals performance in a recession, even after stay-home orders are lifted.”
While many SVOD services have seen a rise in viewership, what happens when the pandemic is over? Does it really make sense for a family of four to keep subscriptions to all of the available services? The market is getting crowded, and eventually families will have to decide which streaming services to keep.
In a survey of 1,000 viewers with more than two SVOD subscriptions, FLIXED found that, of the main streaming services (Amazon Prime, Netflix, Hulu, Disney+, and Apple TV+), viewers would most likely drop Apple TV+ before any of their other memberships. Like Disney+, Apple TV+ debuted in November 2019, but unlike Disney+ it met with little to no fanfare. The platform, which costs just $4.99 per month, doesn’t seem to resonate with viewers; 36% of those surveyed said they would most likely not renew their Apple TV+ subscription. Apple has yet to reveal how many subscribers it actually has; analysts put the number at around 30 million, though many of those subscribers are getting the service free for the first year. Analyst Toni Sacconaghi of Bernstein told the Financial Post that one of the reasons Apple TV+ has fallen short is that it is “failing to resonate with customers, perhaps due to its limited content offerings.”
Like Apple TV+, Quibi, a much-hyped streaming service that delivers shows of 10 minutes or less to a subscriber’s phone, has also failed to resonate with audiences. Debuting this spring, Quibi has only 3.5 million subscribers, just 1.7 million of whom are considered “active.” While many could blame a series of blunders (not allowing users to share clips online, being available only on phones), founder Jeffrey Katzenberg blames the pandemic, the same pandemic that is helping other SVOD services excel. “I attribute everything that has gone wrong to coronavirus,” he told The New York Times.
As Covid-19 continues to keep people at home, subscribers will flock to streaming video on-demand services for entertainment. In a crowded market, not all streaming services will succeed. If current subscriber numbers are any indication, Netflix will continue to reign while SVODs like Apple TV+ and Quibi could be in big trouble.