Wednesday, 28 February 2018

Simple tips on how to monitor a server

If you are new to the role of server or system administrator, you are probably wondering exactly what you need to monitor. The answer depends on the application the server is running, since there will be mission-critical services you have to monitor 24/7. Some applications need built-in operating system services running alongside their own, and some ship their own services entirely. Regardless of the application, there is a basic set of components that should always be monitored.

Monitor Central Processing Unit:

The CPU is the core component of the server hardware. The central processing unit executes the instructions of a computer program, carrying out the basic arithmetic, input/output, and logical operations of the system. A CPU that stays pegged at 100 percent for several minutes or hours is the sign of an unhappy server: it can no longer fulfil additional requests, whether they are mission-critical or not. You either have to add more CPUs, upgrade, or shut down the non-essential services that are consuming critical resources. There is no single number that CPU usage must stay under, but if it sits at 75 percent or higher, adopt one of the steps suggested above.
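
As a minimal sketch of that check, assuming a host with the third-party psutil package installed and using the 75 percent figure above as an example threshold:

```python
import psutil

CPU_THRESHOLD = 75  # percent; adjust to your own baseline

def check_cpu(interval_seconds=5):
    # cpu_percent blocks for the interval and returns average utilization
    usage = psutil.cpu_percent(interval=interval_seconds)
    if usage >= CPU_THRESHOLD:
        print(f"ALERT: CPU at {usage:.1f}% - add CPUs or stop non-critical services")
    else:
        print(f"OK: CPU at {usage:.1f}%")

if __name__ == "__main__":
    check_cpu()
```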

Monitor Random Access Memory:

RAM is a form of data storage into which the server can load the data needed by specific applications for faster access, improving overall application performance. RAM is volatile, chip-based memory and is far faster than the comparatively slow hard disk. If the server runs out of RAM, it configures part of the hard drive as virtual memory and reserves that space to be used as if it were memory. This is called swapping, and it degrades performance because the hard drive is much slower than RAM; heavy swapping also contributes to file system fragmentation. If RAM usage is full or steadily rising, try adding more RAM, because that is a cheap way to boost the performance of the server.
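
A small sketch of watching memory and swap pressure, again assuming psutil is available; the thresholds are illustrative only:

```python
import psutil

def check_memory():
    mem = psutil.virtual_memory()
    swap = psutil.swap_memory()
    print(f"RAM used: {mem.percent}% of {mem.total / 2**30:.1f} GiB")
    print(f"Swap used: {swap.percent}%")
    # Sustained swap usage is the signal described above: time to add RAM
    if swap.percent > 10 or mem.percent > 90:
        print("ALERT: memory pressure detected - consider adding more RAM")

if __name__ == "__main__":
    check_memory()
```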

Monitor Hard Disk:

The hard disk is where the server stores data. It contains several rigid rotating platters coated with magnetic material, with magnetic heads arranged to read from and write to them. Data stored on disk is persistent, so unlike RAM it survives a reboot; it is non-volatile and remains available until the end user consciously erases it. It is essential to monitor the hard disk because the operating system requires disk space for normal operation, including various caches and the paging file. The applications running on the server also need space to write temporary data and caches, as well as the data accessed by users. Heavy use across multiple functions can overload the disk and reduce the free space on the drive, which is one of the causes of file system fragmentation and, in turn, performance problems.
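
A hedged sketch of a free-space check across mounted partitions with psutil; the 85 percent alert level is an example, not a recommendation from this article:

```python
import psutil

DISK_ALERT_PERCENT = 85  # example threshold; tune for your workload

def check_disks():
    for part in psutil.disk_partitions(all=False):
        usage = psutil.disk_usage(part.mountpoint)
        status = "ALERT" if usage.percent >= DISK_ALERT_PERCENT else "OK"
        print(f"{status}: {part.mountpoint} at {usage.percent}% "
              f"({usage.free / 2**30:.1f} GiB free)")

if __name__ == "__main__":
    check_disks()
```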

Hardware Issues and Performance:

The hardware components that must be monitored include the CPU fan, power supply, temperature, environmental sensors, CPU hardware status, CMOS battery, chassis intrusion detection, and disk array health. You need to check the CPU fan because it cools the server by expelling heat; if it fails, the server will overheat and can suffer serious damage. You also need to monitor the wattage, amperage, and voltage of the power supply, because the power supply unit transforms AC into the low-voltage, regulated DC power used by the system's internal components. Many other factors should be monitored as well so that errors can be caught before they cause damage.


Apache Hadoop 2.8.3 and Presto integration with EMRFS now supported in Amazon EMR Release 5.12.0

Amazon Elastic MapReduce release 5.12.0 now supports Apache Hadoop 2.8.3, Apache Flink 1.4.0, Apache HBase 1.4.0, Presto 0.188, and Presto integration with EMRFS. You can also use upgraded versions of Hue (4.1.0), Apache Phoenix (4.13.0), and Apache MXNet (1.0.0). Apache Hadoop 2.8.3 is the first version of the Hadoop 2.8.x line supported on Amazon EMR and includes bug fixes and improvements to HDFS and YARN. Presto now supports using the EMR File System (EMRFS) to access data in Amazon S3.

Amazon Web Services Single Sign-On adds support for AWS Command Line Interface access

AWS Single Sign-On can now be used to let users find and obtain AWS Command Line Interface credentials for their various AWS accounts more conveniently. Users sign in to the AWS Single Sign-On user portal with their existing corporate credentials and obtain AWS CLI credentials for all of their assigned AWS accounts from one place. The CLI credentials automatically expire after 60 minutes to protect access to the AWS accounts.

Oracle Patch Set Update (PSU) January 2018 is now available for Amazon Relational Database Service for Oracle

Oracle Patch Set Updates include crucial security fixes and other essential updates. The January 2018 PSUs are now available for Amazon Relational Database Service (RDS) for Oracle. You can visit the Amazon RDS patch update documentation to learn more about the Oracle PSUs supported on Amazon RDS. Amazon RDS makes it convenient to operate, scale, and configure Oracle database deployments in the cloud.

Tuesday, 27 February 2018

Import models from Amazon SageMaker into AWS DeepLens with the latest update

With the latest update, AWS DeepLens customers can import models directly from Amazon SageMaker into AWS DeepLens. This tighter integration makes model deployment more efficient by reducing the model import process to a single click instead of the earlier multi-step workflow, so you can start generating predictions even faster. Amazon SageMaker is a managed service that enables data scientists and developers to quickly and easily build, train, and deploy machine learning models at any scale. It removes the obstacles that typically slow developers down when they want to use machine learning.

Amazon WorkDocs Web Application has been redesigned for an improved collaboration experience

The Amazon WorkDocs web application has been redesigned to deliver a more productive collaboration experience. The new user interface lets users organize their files better and makes it easier to share content with others. Users can now reach common actions such as locking, unlocking, and editing files, sharing content, and viewing favourites with a single click. They can also see insights into how their files and folders are being used, and control access to their content quickly and easily. A My Account panel gives users a central place to manage account and profile settings.

AWS Cost Explorer API enables users to access Reserved Instance coverage information

Reserved Instance (RI) coverage tracks the number of running instance hours that are covered by RIs. It lets you understand RI coverage at a high level, or drill into detail by filtering on a particular set of accounts, regions, tags, instance types, and more. From there you can use the insights gathered from the data to optimize RI purchases for cost efficiency. The AWS Cost Explorer API is a low-latency query service that offers programmatic access to the Cost Explorer dataset, which includes usage data, cost allocation tags, cost, and advanced metrics.
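
As a hedged sketch of that programmatic access with boto3 (the date range is a placeholder), RI coverage can be pulled from the Cost Explorer client:

```python
import boto3

# Cost Explorer is served from the us-east-1 endpoint
ce = boto3.client("ce", region_name="us-east-1")

response = ce.get_reservation_coverage(
    TimePeriod={"Start": "2018-01-01", "End": "2018-02-01"},  # example dates
    Granularity="MONTHLY",
)

for period in response["CoveragesByTime"]:
    covered = period["Total"]["CoverageHours"]["CoverageHoursPercentage"]
    print(period["TimePeriod"]["Start"], f"{covered}% of instance hours covered by RIs")
```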

Monday, 26 February 2018

Network Load Balancer has now added support for Cross-Zone Load Balancing

With support for cross-zone load balancing, the Network Load Balancer can now distribute requests regardless of Availability Zone. This new feature lets the Network Load Balancer route incoming requests to application targets deployed across multiple Availability Zones. The Network Load Balancer relies on the Domain Name System to distribute requests from clients to the load balancer nodes deployed in multiple Availability Zones. Previously, the Network Load Balancer routed each request to a target in the same Availability Zone, so some targets could receive a disproportionately high number of inbound requests when clients cached the DNS information of a zonal load balancer node.
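
If you prefer to flip the feature on from code rather than the console, a minimal boto3 sketch (the load balancer ARN below is a placeholder) looks like this:

```python
import boto3

elbv2 = boto3.client("elbv2")

# Placeholder ARN - substitute the ARN of your own Network Load Balancer
nlb_arn = "arn:aws:elasticloadbalancing:eu-west-1:123456789012:loadbalancer/net/my-nlb/abc123"

# Enable cross-zone load balancing so requests are spread across all AZs
elbv2.modify_load_balancer_attributes(
    LoadBalancerArn=nlb_arn,
    Attributes=[{"Key": "load_balancing.cross_zone.enabled", "Value": "true"}],
)
```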

AWS Elemental MediaPackage available in EU (Paris) and EU (Frankfurt) Regions

AWS Elemental MediaPackage is a just-in-time packaging and video origination service that lets video distributors securely and reliably deliver streaming content at scale. It functions as part of AWS Media Services, the foundation of cloud-based video workflows, giving you the ability to create, package, and deliver video together or independently. The service creates video streams from a single video input, formatted to play on connected mobile phones, tablets, TVs, game consoles, and computers. It also makes it convenient to add popular DVR-like features such as pause, rewind, and start-over.

Amazon Web Services AppSync now adds API Key Extension feature and has expanded to three new regions

AWS AppSync Preview now supports extended API key duration and has expanded its regional availability to three new regions: EU (Ireland), Asia Pacific (Sydney), and Asia Pacific (Tokyo). These are in addition to the already supported US West (Oregon), US East (N. Virginia), and US East (Ohio) regions. AWS AppSync Preview now supports API keys with a duration of up to 365 days, and customers can create more than 10 API keys.

Friday, 23 February 2018

AWS announces the general availability of Serverless Application Repository

The AWS Serverless Application Repository lets users discover, share, and deploy serverless applications for a wide range of use cases, making it quick and easy to get started with serverless computing in the AWS cloud. The repository provides a collection of serverless applications for popular use cases such as Alexa skills, media processing, stream processing, logging, and monitoring, from publishers like TensorIoT, Datadog, Splunk, HERE, and serverless developers around the world. You can find and deploy applications using the AWS Lambda console, AWS SDKs, and AWS CLI, and you can continue to use existing AWS and third-party tools to manage the deployed resources.

With the New Quick Start you can now deploy CloudStax Cache for Redis on Amazon Web Services

The new Quick Start automatically deploys CloudStax Cache for Redis into an Amazon Elastic Container Service cluster on the AWS Cloud in about 30 minutes. CloudStax Cache is an in-memory data store service for Redis that makes it convenient to configure, scale, and manage Redis on AWS. It removes the difficulty associated with deploying and managing Redis, and delivers a scalable, cost-effective, high-performance in-memory database or cache that you can use to improve application performance. CloudStax Cache runs Redis in containers on AWS, using Amazon Elastic Container Service for container orchestration and CloudStax FireCamp for service management.

AWS CodeCommit has added support for Editing and Creating Files through SDKs and Console Editor

You can now create and edit files in an AWS CodeCommit repository directly through the service console, SDKs, and CLI. Previously you had to install and set up a Git client to add or edit a file in a repository; now you can make quick edits straight from the browser. AWS CodeCommit is a managed source control service that makes it convenient for companies to host secure, highly scalable private Git repositories.
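
For the SDK route, a hedged boto3 sketch (the repository name, branch, and file content are placeholders chosen for illustration):

```python
import boto3

codecommit = boto3.client("codecommit")

# Look up the tip commit of the branch; PutFile needs it as the parent commit
branch = codecommit.get_branch(repositoryName="my-repo", branchName="master")
parent_commit_id = branch["branch"]["commitId"]

# Create or update README.md in the repository without a local Git clone
codecommit.put_file(
    repositoryName="my-repo",
    branchName="master",
    fileContent=b"# My Repo\nEdited straight from the SDK.\n",
    filePath="README.md",
    parentCommitId=parent_commit_id,
    commitMessage="Update README via the CodeCommit PutFile API",
)
```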

Thursday, 22 February 2018

Why is Cloud Computing the Best Decision you will make for your Business?

Cloud computing is computing delivered over the Internet on a cloud platform. Earlier, people typically ran their applications or software on physical servers or computers on premises, but cloud computing lets users access applications and programs over the Internet, with everything stored in the cloud. When you update your Facebook status, you are in fact using a cloud computing service. When you check your bank balance and wonder how everything appears on your desktop or mobile in a fraction of a second, cloud computing is the answer. Basically, whatever you do on the Internet, cloud computing is usually in play.

Today roughly 60 percent of businesses worldwide use cloud services, another 30 percent are planning a move to the cloud, and the remaining 10 percent are still struggling with on-premises servers. Most businesses are moving to the cloud because it increases efficiency and is very cost-effective. The following benefits of cloud computing will help you understand why it has become essential for business.

Security:

Whatever debates people have about cloud computing, the first question that pops into their minds is how secure it is. Suppose an employee loses a company laptop with sensitive data on it and there is no copy of that data in the cloud or on any other storage device. That is a very frustrating situation, but if the data had been stored in the cloud, the loss would have been far smaller. When data is stored in the cloud you can access it from anywhere, even if something happens to the desktop or laptop, and you can wipe the data remotely before it falls into the wrong hands. With cloud computing your data is much more secure, and you can recover from such a disaster.

Cost-Effective:

Every business wants to save money to fuel its future plans. With cloud computing you save a lot on the expensive hardware that would otherwise be required to store data on an on-premises server. You pay only for the services and storage you actually use. Installing hardware can be tricky, but getting started in the cloud is easy because it is user-friendly.

Flexibility:

Businesses that have to grow or shrink their bandwidth according to the flow of traffic should opt for cloud services. With the cloud it is very convenient to increase or decrease capacity and storage as operating needs change, and you can scale up or down easily on the provider's remote servers. Cloud computing provides flexibility along with scalability and availability, which can give a business an edge over its competitors.

Competitive:

Earlier, only businesses with more capital could surface at the top of their market, but now that cloud computing is available to businesses of every size, even a small business can work its way up the ladder. Large competitors are pushed to work harder at giving their customers quality service, which in turn increases competitiveness and benefits the wider community.


Disaster Management:

Most businesses and retailers have a disaster recovery plan, but the few that don't eventually pay for it with heavy losses. Disaster recovery is essential, so large businesses invest money and expertise in preparing for possible disasters. Small businesses don't have that kind of capital, but cloud computing lets them adopt the same practice: they can shorten data recovery time, avoid relying on third-party expertise, and avoid a large upfront investment.

Automatic Updates:

On-premises servers require daily supervision, so you may need a system administrator to watch for updates and errors; with cloud computing you don't. You don't have to worry about updates because the cloud provider handles software and security updates for you, letting you spend more time on the business and less on the server.

Integration:

If you want to share an application or document with other teams, cloud computing makes that possible. For example, G Suite is a cloud service that lets team members view and edit documents together, which builds up team integration.

Remote:

A major benefit of cloud computing is that you can work from anywhere at any time without being confined to a particular place. With a traditional server you can't work remotely, so you are tied to one location, which makes the business more rigid and less flexible. A business that offers flexible working can actually increase productivity because it allows people to balance life and work.

Having read the benefits above, you can see how important it is for a business to adopt the cloud. And cloud computing is not limited to the benefits stated here; it offers further advantages relating to scalability, latency, availability, performance, and much more.

Amazon is the market leader in cloud computing, providing storage, machine learning capabilities, a content delivery network, compute, developer tools, management tools, analytics, security, augmented and virtual reality, application integration, customer engagement, business productivity, and much more. You can help your business take off by increasing performance and reducing expenditure. Adopting the cloud is the smart decision: technology is defining this era, and the businesses that keep up with the latest trends are the ones moving toward success.

Amazon EC2 Auto Scaling has added support for Service-Linked Roles

Amazon EC2 Auto Scaling now supports AWS Identity and Access Management (IAM) service-linked roles. This is a new type of role that makes it convenient to delegate permissions to AWS services. The EC2 Auto Scaling service-linked role is predefined by EC2 Auto Scaling and contains all the permissions the service needs to call other AWS services on your behalf.

Some of the actions that EC2 Auto Scaling performs on your behalf are launching and terminating EC2 instances, and creating Amazon CloudWatch alarms when you create a target tracking scaling policy.

EC2 Auto Scaling automatically creates the default EC2 Auto Scaling service-linked role in your account, if it does not already exist, when an Auto Scaling group is created. You can also create a service-linked role other than the default via IAM and pass it to the Auto Scaling group. Unlike a normal IAM role, the service-linked role cannot be deleted while it is still in use by Auto Scaling groups.

This protects you from inadvertently revoking permissions required by EC2 Auto Scaling. It also helps with auditing requirements: you can monitor AWS CloudTrail for actions logged by EC2 Auto Scaling against the specified service-linked role.
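
A hedged sketch of creating a custom service-linked role and handing it to a new Auto Scaling group; the group name, launch configuration, Availability Zone, and suffix below are placeholders:

```python
import boto3

iam = boto3.client("iam")
autoscaling = boto3.client("autoscaling")

# Create a service-linked role for EC2 Auto Scaling with a custom suffix
role = iam.create_service_linked_role(
    AWSServiceName="autoscaling.amazonaws.com",
    CustomSuffix="blue-fleet",          # placeholder suffix
    Description="Service-linked role for the blue fleet",
)
role_arn = role["Role"]["Arn"]

# Pass the role to a new Auto Scaling group instead of the default one
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="blue-fleet-asg",      # placeholder name
    LaunchConfigurationName="blue-fleet-lc",    # assumes this already exists
    MinSize=1,
    MaxSize=3,
    AvailabilityZones=["us-east-1a"],
    ServiceLinkedRoleARN=role_arn,
)
```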

AWS OpsWorks Stacks event source is available in Amazon CloudWatch Events

An AWS OpsWorks Stacks event source is now available in Amazon CloudWatch Events, which can be used to trigger events or notifications. Previously you had to assemble calls to various APIs to react to OpsWorks Stacks instance, command, or deployment state changes. Now you can trigger an event when, for example, an instance changes from the online to the stopped state or when an application deployment succeeds, and you can use Amazon CloudWatch Events to send alerts so you are notified of potential issues.
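
As a hedged sketch (the SNS topic ARN is a placeholder, and the broad event pattern below is an assumption on my part; narrow it for production), a CloudWatch Events rule for OpsWorks Stacks events could be wired up like this:

```python
import json
import boto3

events = boto3.client("events")

# Match all events emitted by AWS OpsWorks Stacks (instance, command,
# and deployment state changes)
events.put_rule(
    Name="opsworks-stack-events",
    EventPattern=json.dumps({"source": ["aws.opsworks"]}),
    State="ENABLED",
)

# Forward matching events to an SNS topic (placeholder ARN) for notification
events.put_targets(
    Rule="opsworks-stack-events",
    Targets=[{"Id": "notify-ops", "Arn": "arn:aws:sns:us-east-1:123456789012:ops-alerts"}],
)
```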

Inter-Region VPC Peering has added 9 more AWS Regions

Inter-Region Virtual Private Cloud (VPC) Peering is now available in more AWS regions: Canada (Central), Asia Pacific (Mumbai), Asia Pacific (Sydney), Asia Pacific (Singapore), Asia Pacific (Tokyo), EU (Ireland), EU (Paris), South America (São Paulo), and EU (London). These are in addition to the US West (Oregon), US East (Ohio), US West (Northern California), and US East (Northern Virginia) Regions.

Inter-Region VPC Peering enables VPC resources such as AWS Lambda functions, Amazon EC2 instances, and Amazon RDS databases to communicate with each other using private IP addresses, without requiring separate physical hardware, gateways, or VPN connections. It delivers a simple, cost-effective way to share resources between regions or replicate data for geographic redundancy.

Inter-Region VPC Peering is built on the same horizontally scaled, highly available, redundant technology and encrypts inter-region traffic with no bandwidth bottleneck and no single point of failure. Traffic over Inter-Region VPC Peering always stays on the global AWS backbone and never crosses the public internet, which reduces threat vectors such as DDoS attacks and common exploits. Data transferred over Inter-Region VPC Peering connections is charged at the standard inter-region data transfer rates.
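
As a hedged sketch (all IDs and regions below are placeholders), an inter-region peering connection can be requested and then accepted from the peer region with boto3:

```python
import boto3

# Request a peering connection from a VPC in eu-west-1 to a VPC in ap-southeast-2
ec2_requester = boto3.client("ec2", region_name="eu-west-1")
peering = ec2_requester.create_vpc_peering_connection(
    VpcId="vpc-11111111",          # placeholder requester VPC
    PeerVpcId="vpc-22222222",      # placeholder accepter VPC
    PeerOwnerId="123456789012",    # placeholder account ID
    PeerRegion="ap-southeast-2",   # inter-region peering: the accepter's region
)
pcx_id = peering["VpcPeeringConnection"]["VpcPeeringConnectionId"]

# The owner of the peer VPC accepts the request from the peer region
ec2_accepter = boto3.client("ec2", region_name="ap-southeast-2")
ec2_accepter.accept_vpc_peering_connection(VpcPeeringConnectionId=pcx_id)
```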

Wednesday, 21 February 2018

Amazon Inspector has added support for Windows Server 2016

Amazon Inspector now lets users run security assessments for Common Vulnerabilities and Exposures (CVE) and Runtime Behaviour Analysis against Amazon EC2 instances running Windows Server 2016. To run a security assessment, install the Amazon Inspector agent on the desired EC2 instances, set up the assessment in the Inspector console, and run it. Amazon Inspector is an on-demand security assessment service that helps AWS customers validate the security configuration of the operating system and applications deployed in their Amazon EC2 environment.

Amazon Web Services' Trusted Advisor is offering the S3 Bucket Permissions check for free

The Amazon S3 Bucket Permissions check in AWS Trusted Advisor is now free, helping customers secure their data; previously it was available only to Business and Enterprise support customers. The check identifies S3 buckets that are publicly accessible due to ACLs or policies that allow read or write access for any user. Business and Enterprise support customers can also use the check for automated action through Trusted Advisor's integration with Amazon CloudWatch Events.

Amazon has announced that the updated AWS Certified Solutions Architect - Associate Exam is now available

The updated version of the AWS Certified Solutions Architect - Associate exam is now available. It covers current architectural best practices and newer services, built around the pillars of the Well-Architected Framework. The exam validates your knowledge of how to architect and deploy robust, secure applications on AWS technologies. Candidates should have at least one year of experience designing cost-efficient, fault-tolerant, scalable, and distributed systems on AWS.

Tuesday, 20 February 2018

AWS Certified Developer - Associate Beta Exam is open for registration

Amazon Web Services is hosting a beta of the updated AWS Certified Developer - Associate exam. It is recommended for individuals who know how to develop and maintain applications on AWS, and it contains questions on AWS services, features, and best practices. You can register today; the beta exam costs US$75, which is 50 percent of the cost of the standard Developer Associate exam. The beta runs from February 19 until March 16, 2018. Registration space is limited and will close when capacity is reached. Beta exam results are provided within 90 days of the close of the beta. Candidates who pass will receive a voucher for 50 percent off the AWS Certified DevOps Engineer - Professional exam, valid for 12 months.

Amazon Elasticsearch Service is now available in the AWS GovCloud (US) Region

AWS has made Amazon Elasticsearch Service available in the AWS GovCloud (US) Region. Amazon Elasticsearch Service is a managed service that provides Elasticsearch's easy-to-use APIs and real-time capabilities along with the availability, scalability, and security required for production workloads. The service is now available in 17 regions worldwide: Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Asia Pacific (Seoul), Asia Pacific (Mumbai), Canada (Central), South America (São Paulo), US East (N. Virginia), US East (Ohio), US West (Oregon), US West (N. California), AWS GovCloud (US), EU (Ireland), EU (London), EU (Frankfurt), EU (Paris), and China (Ningxia), operated by NWCD.

Monday, 19 February 2018

Orion Health launches Rhapsody Integration with Amazon Web Services

Orion Health has announced an integration with Amazon Web Services for Rhapsody-as-a-Service (RaaS), a full-service model for the global healthcare sector. It launches with RaaS Global Integration as a Service, which maintains and manages the entire integration engine for users. Ian McCrae, CEO of Orion Health, said that migrating from on-premises servers to cloud-based services gives businesses leverage, and that the healthcare sector stands to benefit from moving to the cloud just as other industries have, helped by recent developments in healthcare certification and compliance. Orion Health wants to enhance its solutions to help healthcare organizations transform their IT systems and provide improved care.

Amazon Web Services CodeStar is now available in the Asia Pacific (Seoul) Region

AWS CodeStar, which helps developers quickly build, develop, and deploy applications on AWS, is now available in the Asia Pacific (Seoul) Region. AWS CodeStar delivers a unified user interface that lets you easily manage your software development activities in one place. With CodeStar you can configure a continuous delivery toolchain in minutes, allowing you to start releasing code faster. CodeStar also integrates with AWS Cloud9, a cloud-based IDE for writing, running, and debugging code with just a browser.

Cloud will enhance your Technology Operations and Performance

To make your business more productive you need to improve the performance of your technology solutions. The cloud makes this possible, but as more tech resources migrate to the cloud, the techniques traditionally used to monitor performance no longer work. Once applications move to the cloud, you need more specific tools to see how your solutions are operating. IT professionals who follow traditional monitoring setups focus their attention on infrastructure and servers: a server's CPU, RAM, hard disk space, and other hardware aspects are monitored to make sure everything is functioning properly and optimally. But when a business migrates to the cloud, there may be no need to watch those elements, and you may never touch a server again.

Saturday, 17 February 2018

SUSE has expanded its Service Relationship with Amazon Web Services



SUSE has announced that it will now allow AWS to resell SUSE Linux Enterprise Server for SAP applications on AWS Marketplace. AWS customers operating SAP workloads on SUSE Linux Enterprise Server for SAP applications will get a streamlined, integrated support experience from SUSE and AWS, and can purchase SUSE Linux Enterprise on demand with pay-as-you-go pricing.

This collaboration meets customer demand for the cost benefits and agility of cloud-based business-critical applications. Naji Almahmoud, Vice President of Global Alliances at SUSE, said that many enterprises use SUSE for mission-critical computing environments that include SAP workloads, and that continued collaboration with AWS Marketplace will help fulfil customer demand for running SAP workloads on Amazon Web Services. Bas Kamphuis, General Manager at Amazon Web Services, said they are excited to collaborate with SUSE to offer SUSE Linux Enterprise Server for SAP applications on AWS Marketplace.

Customers will be offered AWS business support with a single point of contact. He added that the collaboration is built to help customers easily deploy and scale SAP HANA workloads on SUSE Linux Enterprise Server for SAP applications cost-effectively.

Amazon Polly WordPress Plugin is available on Bitnami Amazon Machine Image



Amazon Web Services released the Amazon Polly plugin for WordPress on February 8, which allows users to easily voice their content and launch podcasts directly from their websites. Audio provides a different, more accessible way of consuming content. The Amazon Polly plugin is now available on the popular Bitnami Amazon Machine Images: on AWS Marketplace you can find the WordPress Multisite image packaged by Bitnami and the WordPress image packaged by Bitnami, which come pre-installed with the Amazon Polly plugin for WordPress. The WordPress for Production and WordPress Multi-Tier Bitnami solutions will also be available. You can install the text-to-speech functionality by using the integrated Amazon Polly plugin directly from the Bitnami WordPress stack and start voicing your content within minutes.

AWS revenue is booming as Amazon overtakes Microsoft in market capitalisation



Amazon's market capitalisation surpassed Microsoft's, reaching US$702.5 billion. Amazon and Microsoft lead in different niches, Amazon in retail and Microsoft in desktop and productivity software, but in cloud services the two companies compete directly. Amazon has also reported the financials of its cloud business segment.

Amazon Web Services sales were an estimated US$17.4 billion in 2017, up from US$12.2 billion in 2016. Microsoft, meanwhile, said that Azure revenue is growing at 98 percent year over year, with US$7.8 billion in the last quarter. Amazon's AWS revenue has roughly quadrupled over the past three years, and AWS now contributes about 10 percent of the company's total revenue. Amazon is still behind Apple, worth US$849.2 billion, and Alphabet (Google), worth US$744.8 billion, with Microsoft in fourth place.

Amazon CEO Jeff Bezos is now the richest man in the world, surpassing Microsoft founder Bill Gates, whose net worth reached US$100 billion in 1999. Bezos has a net worth of US$118 billion according to the Bloomberg Billionaires Index. Amazon has officially become the third most valuable company in the United States.

BuckHacker search tool enables users to navigate through unsecured Amazon Web Services buckets



The BuckHacker search tool, developed by white-hat hackers, has been launched to let anyone search for unsecured data stored on Amazon Web Services servers. BuckHacker provides a Google-like search engine that crawls the AWS storage servers known as buckets, looking for misconfigured buckets and potentially sensitive data left exposed to the Internet. Many companies have failed to follow basic security practices.

With the rate of data leaks increasing over the past years, even top companies have stored customer and client data on AWS servers without password protection, which means the data can be accessed by anyone who knows the bucket address. BuckHacker tries to make the process faster by letting users search AWS listings using a file name or a bucket name that could be associated with a company. The developers say they are trying to raise public awareness about data security rather than aid attackers.

The BuckHacker developers said the tool is designed to gather results and store them in a database so that other users can view the information. BuckHacker is not the first tool for finding weaknesses in bucket security; AWSBucketDump was launched earlier to let users search leaky AWS buckets, but BuckHacker is more user-friendly.

Amazon Cloud Directory is now available in the EU (Frankfurt) Region

Amazon Cloud Directory lets users build flexible, cloud-native directories for organizing hierarchies of data along multiple dimensions. You can create directories for use cases such as device registries, organizational charts, and course catalogs, and build an organizational chart that can be navigated through separate hierarchies for cost center, reporting structure, and location. Amazon Cloud Directory has now expanded to the EU (Frankfurt) Region.

Introducing Real Time Insights on Amazon Web Services Account Activity

Real-Time Insights on AWS Account Activity helps users conveniently monitor AWS account activity in real time. The solution provisions the services required to record and visualize resource usage and access metrics for an AWS account. It is built to deliver a framework for visualizing metrics in real time, letting you focus on adding more metrics instead of managing the underlying infrastructure. Real-Time Insights helps customers know who is using which resources and how those resources are being used.

AWS Config now adds support for AWS Web Application Firewall Rule Groups

With AWS Config, users can now record configuration changes to AWS WAF rule groups. AWS Web Application Firewall helps secure web applications from common web threats and exploits, and a rule group is a set of predefined rules that can be added to a web ACL. Using AWS Config you can track historical changes to the rules and metrics associated with WAF rule groups. AWS Config supports AWS WAF in all public AWS regions where WAF is available.

Friday, 16 February 2018

Splunk - Amazon Web Services Serverless Applications

Amazon Web Services has announced the AWS Serverless Application Repository, which allows AWS customers to discover, publish, and deploy serverless applications for stream processing, data processing, Internet of Things device telemetry, and much more. A serverless application follows the AWS Serverless Application Model (SAM) format, an AWS CloudFormation template that packages all the resources the customer needs to deploy a serverless architecture.

Splunk is an AWS Advanced Technology Partner with AWS Competencies in Security, Education, Big Data, DevOps, Government, and IoT. Splunk software and cloud services help customers gain business insights, prevent fraud, reduce cost, mitigate cybersecurity risk, and improve service. The Splunk AWS serverless applications are available via the AWS Lambda console and enable customers to ingest terabytes of data into Splunk. A Lambda blueprint launch partner in 2015, Splunk has released two blueprints so far, and as serverless adoption has grown it has added more purpose-built blueprints based on customer requests for different data sources.

Splunk Lambda Blueprints:

In collaboration with Amazon Web Services, Splunk has included six of its existing Lambda blueprints as Splunk serverless applications available in the Serverless Application Repository. A minimal sketch of the forwarding pattern they share follows the list.

Splunk Logging: Stream log events from AWS Lambda to Splunk's HTTP Event Collector.

Splunk DynamoDB Stream Processor: Stream Amazon DynamoDB events to Splunk's HTTP Event Collector.

Splunk Elastic Load Balancer application access logs processor: Stream Application Load Balancer access logs from Amazon S3 to Splunk's HTTP Event Collector.

Splunk Elastic Load Balancer classic access logs processor: Stream Classic Load Balancer access logs from Amazon S3 to Splunk's HTTP Event Collector.

Splunk IoT Processor: Stream AWS IoT Core events to Splunk's HTTP Event Collector.

Splunk Kinesis Stream Processor: Stream events from Amazon Kinesis to Splunk's HTTP Event Collector.
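
As noted above, here is a hedged, minimal sketch of the general pattern these blueprints follow: a Lambda handler that forwards its event payload to a Splunk HTTP Event Collector. The HEC URL and token are placeholders, and this is not the Splunk blueprint code itself.

```python
import json
import os
import urllib.request

# Placeholders - in practice these would come from Lambda environment variables
SPLUNK_HEC_URL = os.environ.get(
    "SPLUNK_HEC_URL", "https://splunk.example.com:8088/services/collector/event")
SPLUNK_HEC_TOKEN = os.environ.get(
    "SPLUNK_HEC_TOKEN", "00000000-0000-0000-0000-000000000000")

def lambda_handler(event, context):
    # Wrap the incoming Lambda event as a single HEC event payload
    payload = json.dumps({"event": event, "sourcetype": "aws:lambda"}).encode("utf-8")
    request = urllib.request.Request(
        SPLUNK_HEC_URL,
        data=payload,
        headers={
            "Authorization": f"Splunk {SPLUNK_HEC_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        return {"status": response.status}
```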

You can sign up for a private preview to get access to the AWS Serverless Application Repository. In the AWS Lambda console, when you create a new function, choose Serverless Application Repository and search for Splunk to find the purpose-built serverless applications.

The Splunk serverless apps are open source, including their AWS Serverless Application Model templates, so you can inspect them for better insight. They let customers benefit from the cost-effectiveness, flexibility, and scalability of serverless computing, and help aggregate, prioritize, and trace data using the insights tooling built jointly by Splunk and AWS Marketplace.






Amazon WorkSpaces Application Manager supports Windows 10

Amazon WorkSpaces Application Manager can now be used to deploy applications to Amazon WorkSpaces running Windows 10. WorkSpaces Application Manager gives users on-demand access to centrally managed desktop applications, including applications available through the Amazon Marketplace for Desktop Apps. It uses application virtualization technology to simplify application provisioning and offers fine-grained controls to IT administrators, all from the Amazon WorkSpaces console. With this release, you can use WorkSpaces Application Manager on WorkSpaces running Windows 7 or Windows 10.

Amazon GameLift Spot Instances and FleetIQ can cut costs by up to 90 percent

Amazon GameLift introduces Spot Instances and FleetIQ. Spot Instances give customers access to spare AWS computing capacity at savings of up to 90 percent compared with On-Demand prices. These savings can be achieved while maintaining high game server availability by using FleetIQ, a new feature of Amazon GameLift queues that places new game sessions on game servers based on Spot interruption rates, player latencies, and instance prices. Amazon GameLift is a dedicated game server hosting and matchmaking solution for games.

Amazon Polly Phonation Tags can now generate Softer Speech

Amazon Web Services announced a new phonation tag for Amazon Polly, the API-based text-to-speech service. You can now generate softer speech with any of the Amazon Polly voices using the new phonation Speech Synthesis Markup Language (SSML) tag. Amazon Polly also supports other SSML tags for features such as dynamic range compression, vocal tract length, prosody, and whispered speech, which let you customize the voices to your preference. You can visit the Amazon Polly documentation on SSML tags for more information and try the new feature in the Amazon Polly console.
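
A hedged sketch of calling the feature from boto3; the exact SSML shown, `<amazon:effect phonation="soft">`, is my reading of the tag described above, so check the Polly SSML documentation for the authoritative syntax:

```python
import boto3

polly = boto3.client("polly")

# Wrap the text in the soft-phonation effect
ssml = (
    '<speak>'
    '<amazon:effect phonation="soft">'
    'This sentence should come out in a softer voice.'
    '</amazon:effect>'
    '</speak>'
)

response = polly.synthesize_speech(
    Text=ssml,
    TextType="ssml",
    VoiceId="Joanna",
    OutputFormat="mp3",
)

with open("soft_speech.mp3", "wb") as f:
    f.write(response["AudioStream"].read())
```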

Thursday, 15 February 2018

IAM Security Best Practices for Advanced Security on the AWS Cloud

AWS Identity and Access Management (IAM) is a web service that provides security-controlled access to AWS resources. IAM is used to control who is permitted to sign in and which resources they are authorized to use. In this article you will learn how to improve IAM security with a few simple tasks that you can apply to your AWS account.

First, check the Security Status section of the IAM console. If any alerts are listed there, you have come to the right place: this article shows how to fix them and further improve your security posture.

Eliminate the Root access Keys:

The root account in AWS is a powerful account that cannot be restricted in any way. It should be used only to create new users and grant permissions, and not for day-to-day work. The root account's access keys should be removed, because the root account should only be used for console access, not for AWS CLI or SDK access.

To eliminate the root access keys, click "Delete your root access keys" in the Security Status section and then click Manage Security Credentials. The Your Security Credentials page is displayed; open the Access Keys section. A list of access keys appears; delete all of them using the Delete link on the right-hand side and confirm the dialogue box that appears.
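
If you are signed in with the root credentials themselves, the same cleanup can be scripted; this is only a hedged sketch, and the console route above is the usual way to do it:

```python
import boto3

# Assumes the session is authenticated as the root user; without a UserName
# argument, IAM acts on the identity making the call.
iam = boto3.client("iam")

for key in iam.list_access_keys()["AccessKeyMetadata"]:
    print("Deleting root access key", key["AccessKeyId"])
    iam.delete_access_key(AccessKeyId=key["AccessKeyId"])
```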

Create Users and Groups:

Because the root account is so important, it is suggested to create an IAM user to manage AWS. If more than one person is involved, you need to grant permissions to different team members, and grouping users becomes essential because groups are easier to manage and keep delegation consistent.
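
A hedged sketch of that setup with boto3; the group name, user name, and managed policy below are placeholders chosen for illustration:

```python
import boto3

iam = boto3.client("iam")

# Create a group for administrators and attach an AWS managed policy to it
iam.create_group(GroupName="Admins")
iam.attach_group_policy(
    GroupName="Admins",
    PolicyArn="arn:aws:iam::aws:policy/AdministratorAccess",
)

# Create an individual user and add them to the group instead of
# attaching policies to the user directly
iam.create_user(UserName="alice")
iam.add_user_to_group(GroupName="Admins", UserName="alice")
```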

Password Policy:

Click Account Settings on the left-hand side to configure a password policy. Set the parameters and values you want, then click Apply password policy.
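
The same policy can be applied through the API; here is a minimal sketch, where the specific values are examples rather than recommendations from this article:

```python
import boto3

iam = boto3.client("iam")

# Example password policy; tune the values to your own compliance requirements
iam.update_account_password_policy(
    MinimumPasswordLength=12,
    RequireSymbols=True,
    RequireNumbers=True,
    RequireUppercaseCharacters=True,
    RequireLowercaseCharacters=True,
    MaxPasswordAge=90,
    PasswordReusePrevention=5,
)
```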

Multifactor Authentication:

In most companies only a limited, chosen set of users can access the AWS console or hold access keys. Multi-factor authentication (MFA) is an essential security feature that should be applied to all of those users, and it should certainly be enabled on the root account. To enable MFA on the root account, click "Activate MFA on your root account" in the Security Status section and then click Manage MFA. For an IAM user, select the desired user, open the Security Credentials tab, and click the edit icon next to the assigned MFA device. Enabling MFA involves a short list of steps, the first of which is choosing between a hardware and a virtual device. A virtual device involves no extra cost because you can use an app such as Google Authenticator. In this article a virtual MFA device is used; after selecting it, click Next Step.

A dialogue box appears telling you to install the application. On your device, install Google Authenticator, get it ready to add an account, and click Next Step. Now scan the QR code: click the add button in the app and point the device at the QR code, which identifies the device and creates an entry in Google Authenticator. Once the procedure is complete, a dialogue box confirms that the MFA device has been successfully configured for the account; click Finish.

Once MFA is enabled, every new login to the AWS console requires the code displayed in Google Authenticator before access is permitted.
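
For IAM users this can also be scripted. A hedged sketch follows; the user name and the two authentication codes are placeholders, and the codes come from the authenticator app after you load the seed the API returns:

```python
import boto3

iam = boto3.client("iam")

# Create a virtual MFA device; the response contains a base32 seed to load
# into an authenticator app (a QR code PNG is also returned)
device = iam.create_virtual_mfa_device(VirtualMFADeviceName="alice-mfa")
serial = device["VirtualMFADevice"]["SerialNumber"]
seed = device["VirtualMFADevice"]["Base32StringSeed"]
print("Load this seed into your authenticator app:", seed.decode())

# After loading the seed, supply two consecutive codes from the app
iam.enable_mfa_device(
    UserName="alice",
    SerialNumber=serial,
    AuthenticationCode1="123456",   # placeholder first code
    AuthenticationCode2="654321",   # placeholder second consecutive code
)
```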

Track IAM Users' Utilization:

After enabling MFA, you can use a built-in feature to keep a record of IAM users' activity: the status of their credentials, their MFA information, and the last time they signed in. It is recommended to remove accounts that are no longer in use.

Click Credential Report and then, on the right side, click Download Report to get a CSV file. In that file the administrator can review each user's usage and more. After completing all these steps, refresh the IAM console dashboard and you will see that every item in the Security Status list is marked complete.
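
The same report can be pulled programmatically; a minimal sketch:

```python
import csv
import io
import time
import boto3

iam = boto3.client("iam")

# Ask IAM to (re)generate the credential report, then poll until it is ready
while iam.generate_credential_report()["State"] != "COMPLETE":
    time.sleep(2)

report = iam.get_credential_report()["Content"].decode("utf-8")

# The report is a CSV; list each user with their last password use
for row in csv.DictReader(io.StringIO(report)):
    print(row["user"], "last used password:", row["password_last_used"])
```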

CloudStax FireCamp can be deployed on the AWS Cloud with the new Quick Start

The new Quick Start automatically deploys CloudStax FireCamp into a customizable environment on the AWS Cloud in about 30 minutes. CloudStax FireCamp is an open-source platform for managing stateful services in containers. It supports two container orchestration services, Docker Swarm and Amazon Elastic Container Service, and makes it convenient to configure, scale, and manage stateful services on AWS, using three Availability Zones for automatic failover and high availability. FireCamp integrates with many open-source stateful services including Elasticsearch, Cassandra, Kafka, Redis, MongoDB, and PostgreSQL, so you can run these services on AWS with no management overhead.

Amazon Elastic Container Service can now be configured with Service Auto Scaling using target tracking policies

Service Auto Scaling can now be set up using target tracking policies for containerized services directly from the Amazon Elastic Container Service console. With target tracking you choose a load metric for the service, such as request count per target or average CPU utilization, and set a target value; Auto Scaling then adjusts the number of running tasks for the service to maintain that target. Previously you could only scale an ECS service using step scaling policies, which required you to create and manage Amazon CloudWatch alarms for the metrics that trigger scaling adjustments.
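
Outside the console, the same policy can be expressed through the Application Auto Scaling API; a hedged sketch, with placeholder cluster and service names:

```python
import boto3

aas = boto3.client("application-autoscaling")

resource_id = "service/my-cluster/my-service"  # placeholder cluster/service

# Register the ECS service's desired count as a scalable target
aas.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=2,
    MaxCapacity=10,
)

# Target tracking: keep average CPU utilization of the service near 50 percent
aas.put_scaling_policy(
    PolicyName="cpu-target-tracking",
    ServiceNamespace="ecs",
    ResourceId=resource_id,
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 50.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
    },
)
```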

Amazon DynamoDB Accelerator adds more regions, support for T2 instances, and SDKs for Python and .NET

Amazon DynamoDB Accelerator (DAX) has announced the release of SDKs for .NET and Python, and has added support for T2 instances in all supported regions. T2 instances deliver a baseline level of CPU performance and are more cost-effective for development and test workloads or production applications that need small caches. DAX is now also available in the Asia Pacific (Sydney) and Asia Pacific (Singapore) Regions. DAX is a fully managed, highly available in-memory cache for DynamoDB that can improve response times from milliseconds to microseconds, up to a 10x performance improvement, even at millions of requests per second.

Wednesday, 14 February 2018

AWS AppSync adds new GraphQL functionality and Removes Whitelist Approvals from preview

AWS AppSync Preview now adds new GraphQL features to improve application development and removes the whitelist-only access requirement to use the service. Customers can now auto-generate a GraphQL schema and resolvers from an existing Amazon DynamoDB table without writing any code, and can immediately run queries and mutations through the generated API endpoint or set it up to use GraphQL subscriptions for real-time data. AWS has also added support for GraphQL unions and interfaces in AWS AppSync, as well as local resolvers, which let you trigger subscriptions without mutating data.

Amazon Lex Chatbot Schema now enables export and import

Users of Amazon Lex can now export and import chatbot schemas, simplifying replication of a chatbot for development and deployment. Amazon Lex provides the ability to export and import a chatbot definition as a JSON file. The JSON configuration file includes the structure of the Amazon Lex chatbot: slots, slot types, prompts, and the intent schema with sample utterances. After exporting the bot schema file, users can modify it and import it into the same account or a different one. The same capability can be used to replicate a chatbot to another region when configuring a highly available multi-region architecture.
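
A hedged boto3 sketch of the round trip; the bot name, version, and target region are placeholders, and I am assuming the GetExport/StartImport operations of the "lex-models" client as described in the SDK documentation:

```python
import time
import urllib.request
import boto3

lex = boto3.client("lex-models")

# Request the export and poll until the download URL is ready
export = lex.get_export(name="OrderFlowers", version="1",
                        resourceType="BOT", exportType="LEX")
while export["exportStatus"] == "IN_PROGRESS":
    time.sleep(2)
    export = lex.get_export(name="OrderFlowers", version="1",
                            resourceType="BOT", exportType="LEX")

archive = urllib.request.urlopen(export["url"]).read()

# Import the archive into another region (or use another account's credentials)
lex_target = boto3.client("lex-models", region_name="eu-west-1")
lex_target.start_import(payload=archive, resourceType="BOT",
                        mergeStrategy="OVERWRITE_LATEST")
```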

Network Load Balancer is now available in the EU (Paris) Region

Network Load Balancer, which is built to handle millions of requests per second while maintaining ultra-low latencies, is now available in the AWS EU (Paris) Region. Network Load Balancer is optimized to handle volatile traffic patterns while using a single static IP address per Availability Zone. It operates at the connection level (layer 4), routing connections to Amazon EC2 instances based on IP protocol data. It also preserves the client-side source IP, allowing applications to see the IP address of the client and use it for further processing.

Tuesday, 13 February 2018

Amazon Lex gains a responses capability and SSML support in text responses

An Amazon Lex chatbot can now define responses from the AWS Management Console. A response consists of messages chosen from a group of predefined messages created by the developer. Messages in these groups can be simple text, or you can use custom markup, and you can also show a response card as part of the response. This update enhances and simplifies the creation of dynamic conversations with Amazon Lex. You can also use Speech Synthesis Markup Language (SSML) in text responses.

Longer-format resource IDs are now available in Amazon EC2

AWS announced in December 2017 that Amazon Elastic Container Service, AWS Storage Gateway, and Amazon EC2 resources would get longer identifiers to support the ongoing growth of AWS. You can now use longer IDs in Amazon EC2 through the APIs or the AWS Management Console. Users can test the longer format and opt in when they are ready, until the end of June 2018; after that, new resources will be created with longer IDs by default. Existing resources are not affected; only new resources get longer IDs.
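
A hedged sketch of opting in for one resource type via boto3; which resource types accept the long-ID opt-in varies by account and date, so treat "instance" here as an example:

```python
import boto3

ec2 = boto3.client("ec2")

# Opt the calling identity in to longer IDs for newly created instances
ec2.modify_id_format(Resource="instance", UseLongIds=True)

# Verify the current setting
for entry in ec2.describe_id_format(Resource="instance")["Statuses"]:
    print(entry["Resource"], "uses long IDs:", entry["UseLongIds"])
```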

Amazon WorkMail has added support for MAPI over HTTP Protocol

Amazon WorkMail has announced support for the MAPI over HTTP protocol for Microsoft Outlook for Windows. This protocol improves the connection stability between Amazon WorkMail and Microsoft Outlook, giving users a more reliable connection and a more responsive experience. Microsoft Outlook 2010, 2013, and 2016 with the latest service packs will automatically transition to MAPI over HTTP the next time they are opened. Users of Microsoft Outlook 2007 can continue to connect to Amazon WorkMail using RPC over HTTP. This update is available in all AWS Regions where Amazon WorkMail is available.

Monday, 12 February 2018

Amazon Elastic Container Service adds a new endpoint to access task metadata and metrics

Container-level Docker statistics and task metadata can now be queried for tasks launched using the awsvpc network mode. This gives you environment data such as the container, image ID, and task, and lets you check the health and status of running tasks and containers. Previously you could only access container statistics and metadata by reading a metadata file on the host EC2 instance. Docker statistics and metadata can now be queried from an HTTP endpoint using RESTful API calls for tasks launched with the awsvpc network mode, which makes them much easier to retrieve.
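
From inside a task that uses awsvpc networking, a hedged sketch of querying the endpoint looks like the following; the 169.254.170.2 address and /v2 paths reflect my reading of the ECS task metadata endpoint documentation, so verify them for your platform version:

```python
import json
import urllib.request

# Version 2 task metadata endpoint, reachable from containers in awsvpc tasks
METADATA_URL = "http://169.254.170.2/v2/metadata"
STATS_URL = "http://169.254.170.2/v2/stats"

def fetch(url):
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())

metadata = fetch(METADATA_URL)
print("Task ARN:", metadata["TaskARN"])
for container in metadata["Containers"]:
    print("Container:", container["Name"], "image:", container["Image"])

# Docker stats, keyed by container ID
stats = fetch(STATS_URL)
print("Stats available for", len(stats), "containers")
```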

AWS Direct Connect has landed its second site in Paris, now available at Equinix PA3, France

An AWS Direct Connect site is now available at Equinix PA3 in Paris, France, following the region launch in December 2017. You can sign up from the AWS Management Console under the EU (Paris) Region. AWS Direct Connect has also launched its first site in Taiwan, at Chief Telecom LY in Taipei, which appears under the Asia Pacific (Tokyo) Region in the Management Console. Customers in Paris and Taipei can now create dedicated network connections from their premises to AWS, and with Direct Connect's global access these sites can reach AWS resources in any global AWS Region.
