Pakistan's First Oracle Blog

Blog By Fahd Mirza Chughtai

Failed to start The nginx HTTP and reverse proxy server on RedHat EC2 Linux

Tue, 2021-10-19 21:55

 I needed a simple reverse proxy to redirect connections to an RDS database in a private subnet, so I quickly created a RedHat Linux EC2 instance, installed NGINX, and set up the nginx.conf file for session redirection. My nginx.conf looked like the following:

user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;
pid /run/nginx.pid;

include /usr/share/nginx/modules/*.conf;

events {
    worker_connections 1024;
}

stream {
    upstream target_server {
        server targetdb:1521;
    }

    server {
        listen 1521;
        proxy_pass target_server;
    }
}


But starting the nginx process gave the following error:


[root@test nginx]# systemctl start nginx
Job for nginx.service failed because the control process exited with error code. See "systemctl status nginx.service" and "journalctl -xe" for details.

[root@test nginx]# systemctl status nginx.service
● nginx.service - The nginx HTTP and reverse proxy server
   Loaded: loaded (/usr/lib/systemd/system/nginx.service; enabled; vendor preset: disabled)
   Active: failed (Result: exit-code) since Wed 2021-10-20 13:40:57 AEDT; 5s ago
  Process: 14702 ExecStartPre=/usr/sbin/nginx -t (code=exited, status=1/FAILURE)
  Process: 14700 ExecStartPre=/usr/bin/rm -f /run/nginx.pid (code=exited, status=0/SUCCESS)

Oct 20 13:40:57 test systemd[1]: Starting The nginx HTTP and reverse proxy server...
Oct 20 13:40:57 test nginx[14702]: nginx: [emerg] unknown directive "stream" in /etc/nginx/nginx.conf:9
Oct 20 13:40:57 test nginx[14702]: nginx: configuration file /etc/nginx/nginx.conf test failed
Oct 20 13:40:57 test systemd[1]: nginx.service: control process exited, code=exited status=1
Oct 20 13:40:57 test systemd[1]: Failed to start The nginx HTTP and reverse proxy server.
Oct 20 13:40:57 test systemd[1]: Unit nginx.service entered failed state.
Oct 20 13:40:57 test systemd[1]: nginx.service failed.

Solution: Just install nginx-mod-stream

The stream module is not compiled into the stock RHEL nginx package; it ships separately as nginx-mod-stream, and the include /usr/share/nginx/modules/*.conf line at the top of nginx.conf loads the module once the package is installed.

[root@test nginx]# ls -ltr /usr/lib/nginx/modules/ngx_stream_module.so
ls: cannot access /usr/lib/nginx/modules/ngx_stream_module.so: No such file or directory
[root@ip-10-219-40-147 nginx]# yum install nginx-mod-stream

Now if you start the nginx service, it should work.
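Once it's up, a quick way to smoke-test that the stream proxy is actually accepting TCP connections is a couple of lines of Python. This is just a sketch; the host and port follow the config above, and it assumes you run it on the proxy host itself.

import socket

# Try the port the stream block listens on (1521 per the config above).
with socket.create_connection(("localhost", 1521), timeout=5):
    print("stream proxy is accepting connections")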
Categories: DBA Blogs

Introduction to Oracle Blockchain Tables in 21c

Mon, 2021-08-16 18:32

When I first heard a few years back that Oracle was now available in blockchain, I fell from my chair in awe at how quickly they had adopted this bleeding-edge technology. I thought Oracle Corporation would also release its own cryptocurrency. But then I learned that a blockchain oracle is not that Oracle.


A blockchain oracle is just a service which feeds required information to smart contracts. A smart contract is just a program on a blockchain which creates an immutable agreement between stakeholders without a third party.

Anyway, at last Oracle has dipped its feet into blockchain technology, it seems. In Oracle 21c, we now have something called blockchain tables. A blockchain is simply a linked chain of blocks which stores a ledger of transactions, verified and duplicated by all the nodes based on a complex cryptographic algorithm. This transparency, decentralization, and cryptographic foundation makes it more secure and eliminates the need for third parties.

Oracle blockchain is not really that decentralized: it centralizes the whole blockchain within the Oracle database. Blocks in a blockchain table are chained together by row hashes. You can only insert data into these tables; rows can be deleted only subject to a retention value.


CREATE BLOCKCHAIN TABLE cardano_ledger (ada_id NUMBER, ada_tokens NUMBER)
                     NO DROP UNTIL 365 DAYS IDLE
                     NO DELETE LOCKED
                     HASHING USING "SHA2_512" VERSION "v1";

You can then check the attributes of a blockchain table in the USER_BLOCKCHAIN_TABLES view.
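For example, here is a minimal sketch of eyeballing those attributes from Python with the python-oracledb driver; the connection details are placeholders, and the column list is taken from the 21c documentation.

import oracledb

# Placeholder credentials and DSN -- adjust for your own 21c database.
conn = oracledb.connect(user="testuser", password="secret", dsn="localhost/pdb21c")
with conn.cursor() as cur:
    cur.execute("""
        SELECT table_name, row_retention, row_retention_locked, hash_algorithm
          FROM user_blockchain_tables
    """)
    for row in cur:
        print(row)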


You can delete rows in a blockchain table only by using the DBMS_BLOCKCHAIN_TABLE package, and only rows that are outside the retention period. You can only increase the retention period and not decrease it.


DECLARE
   NUMBER_ROWS NUMBER;
BEGIN
   DBMS_BLOCKCHAIN_TABLE.DELETE_EXPIRED_ROWS('TestUser','cardano_ledger', null, NUMBER_ROWS);
   DBMS_OUTPUT.PUT_LINE('Number of rows deleted=' || NUMBER_ROWS);
END;
/

 

And you cannot truncate a blockchain table in an Oracle database.


Now, is an Oracle blockchain table really a blockchain? It compromises decentralization and is conditionally mutable. I will leave the decision to you.


For more practice with Oracle blockchain tables, check the Oracle docs.


Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 8

Thu, 2021-07-01 21:55

 Welcome to the next edition of weekly Cloud Vanity. By 2025, 85 percent of companies will have containerized applications in production, as per Gartner. By their very nature, containerized apps have to be orchestrated, and as of now, K8s is the most viable option. To take the edge off the K8s complexity, we have EKS, AKS, ACK, and GKE, among various other options. Kubernetes is here to stay, one way or another.


AWS:

The iOS clients rely on Amazon Cognito for authentication and authorization, provisioned using the Amplify CLI authentication category. The Amplify DataStore is used to persist the data locally on the client side, providing capabilities that allow automatic synchronization of data between devices and the cloud.

MYCOM OSI offers assurance, automation, and analytics software as a service (SaaS) applications for the digital era. The Assurance Cloud Service™ provides critical end-to-end performance, fault and service quality management, and supports AI and machine learning (ML)-driven closed-loop assurance for hybrid, physical, and virtualized networks, across all domains, within a SaaS model.

Binlog replication is a popular feature serving multiple use cases, including offloading transactional work from a source database, replicating changes to a separate dedicated system to run analytics, and streaming data into other systems, but the benefits don’t come for free.

With AWS Glue DataBrew, data analysts and data scientists can easily access and visually explore any amount of data across their organization directly from their Amazon Simple Storage Service (Amazon S3) data lake, Amazon Redshift data warehouse, Amazon Aurora, and other Amazon Relational Database Service (Amazon RDS) databases. You can choose from over 250 built-in functions to merge, pivot, and transpose the data without writing code.

Everyone has their favorite integrated development environment, or IDE, as it’s more commonly known. For many of us, it’s a tool that we rely on for our day-to-day activities. In some instances, it’s a tool we’ve spent years getting set up just the way we want – from the theme that looks the best to the most productive plugins and extensions that help us optimize our workflows.

Azure:

Run a "provisioning" automation for a completely hands-off experience for instrumenting and monitoring any new applications that you create and deploy—using Terraform or ARM Template. Or you can run it on-demand using the Azure CLI for greater flexibility and control.

As defined by Gartner, AIOps enhances IT operations through insights that combine big data, machine learning, and visualization to automate IT operations processes, including event correlation, anomaly detection, and causality determination. 

From edge to cloud, companies are eager to find innovative solutions that meet them where they are. Today’s business environment is increasingly complex, and customers tell us they need solutions that are multi-cloud, platform-agnostic, and offer integrated apps and services that are always up to date. 

This article introduces MSIX & a deep dive/walkthrough on MSIX App Attach, Microsoft’s layering solution for delivering applications to a modern workspace.

A virtual network is like an on-premises network, where we use switches and routers to communicate with servers and clients; an Azure VNet is likewise used for communicating with Azure resources (virtual machines, databases, etc.).

GCP:

The financial services industry is changing—an estimated $68 trillion in wealth transferring from baby boomers to millennials.

For Data Preview 0, the IDF leverages Cloud Storage, Google Kubernetes Engine (GKE), and Compute Engine to provide the Rubin Observatory user community access to simulated LSST data in an early version of the RSP. 

Digital native companies have no shortage of data, which is often spread across different platforms and Software-as-a-service (SaaS) tools.

Many of our customers want to know how to choose a technology stack for solving problems with machine learning (ML).

Setting up Cloud Monitoring dashboards for your team can be time consuming because every team's needs are different. 

Others:

This article describes how to use the ORAS client to push Wasm modules with allowed media types to the Alibaba Cloud Container Registry (ACR) (an OCI-compatible registry).

Before the Dubbo-go development tests started recently, samples were used for functional verification. Now, to reproduce a functional problem, a project of Dubbo-go and Dubbo calling has been planned from scratch to record the potential problems newcomers might encounter.

They transform mission-critical business systems by automating transactions and processes. Functions like campaign to lead, order to cash, procure to pay, incident to resolution, concept to market and hire to retire can now all be optimized and accelerated by artificial intelligence.

HPE helps customers recover from cyberattacks in minutes with acquisition of data protection leader, Zerto

Simplify your software deployments with the Oracle Cloud Infrastructure DevOps service

Categories: DBA Blogs

Best Ultimate Definition of DevOps

Thu, 2021-06-24 23:50

 DevOps enables us to build software in a faster and more reliable way.

That's it.

Yes that's pretty much all there is to it. 

Over the decades, what was marring software development was that software took too long to build, because any serious software needs multiple teams to develop, store, build, test, integrate, and deploy it.

Earlier, at every step, teams worked in silos, taking their sweet time to finish their part and chucking the software over the fence to the next team. Without any automation and collaboration between these teams, software took longer to build, and the end product was always replete with bugs.

DevOps is a natural evolution of that process: code is stored in a single repo, and all developers of an application work in that repo; there is a single automated build process for the repo, which then gets tested on the way to deployment. Agile methodology and shorter sprints ensure that software passes through the phases in days rather than months, and bugs are identified and rectified as early as possible in the cycle.

Then comes the slew of tooling available, ranging from Git or Bitbucket and others for source code, Jenkins or TeamCity and others for build and testing, and then either automatic or manual deployment to different environments.

I hope that helps.

Categories: DBA Blogs

EKS Login error: You must be logged in to the server (Unauthorized)

Tue, 2021-06-22 09:00

 The reason you would receive the following error when you try to access an EKS cluster is that the user who created the EKS cluster is different from the one you are using to run kubectl commands:

error: You must be logged in to the server (Unauthorized)


In order to resolve this, either use aws-iam-authenticator or run your kubectl commands as the same user with which you created the cluster. I normally like to work with the kubeconfig file present, so I use the same user both for creating the EKS cluster and for running kubectl commands. I also set my config and credentials in AWS as follows:


AWS .config file entry:

[profile eks-np]
region = ap-southeast-2
output = json

AWS credentials file entry:

[eks-np]
aws_access_key_id = <key id>
aws_secret_access_key = <access key>
region = ap-southeast-2


and then run the following command to update your .kube config with the cluster and context info:

aws eks --region ap-southeast-2 update-kubeconfig --name ekscluster --profile eks-np
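If you're not sure which identity your kubectl commands will actually resolve to, a quick check with boto3 and STS helps; this is just a sketch, assuming the eks-np profile from above.

import boto3

# Resolve the IAM identity behind the profile that kubectl will use.
session = boto3.Session(profile_name="eks-np")
print(session.client("sts").get_caller_identity()["Arn"])

The printed ARN should match the entity that created the cluster, or one that has been mapped into the cluster's aws-auth ConfigMap.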


I hope that helps.

Categories: DBA Blogs

3 Tips For Mental Health as Cloud Engineer

Mon, 2021-06-21 17:52

 It's simple: our brain is our biggest asset when it comes to being a cloud or DevOps engineer. If you start losing your mind, or if it's chaotic and unsound, then you won't be a cloud engineer for long. Following are 3 tips to maintain your mental health as a cloud engineer.

1- You don't have to Learn it all 

You don't have to learn all the hundreds of services offered by AWS, Azure, GCP, Alibaba, Oracle, etc. Don't stress about learning every DevOps CI/CD tool under the sun. You are not supposed to learn all there is in cloud w.r.t. networking, compute, storage, databases, serverless, streams, ML, AI, integration, and other stuff. Stop falling into that bottomless chasm, as you would never be able to get out of it. Just learn what's needed now. Just learn what fits your background. Just learn what interests you. Just learn what gets the job done.

2- Don't take Cloud Job postings too seriously

One of the ways I was checking what's required by today's businesses in terms of skillset was to review cloud job postings. After going through the skills required, I was getting very depressed, as the list was long. Then I started to realize that it was humanly impossible for anyone to possess all those skills in full. How can I be a full-stack developer with a rich networking background, having worked in a true agile environment, remaining abreast of CI/CD pipelines, with a mandatory 5 years of Docker and Kubernetes experience, topped by some SQL and NoSQL database skills? Job postings don't end there, either. You have to know all those tools like DC/OS, K8s, Maven, Ant, Gradle, TeamCity, Jenkins, Bash, PowerShell, Python, Ruby, MongoDB, Kafka, Oracle, IAM, OKTA, AD, ELK, and a plethora of other acronyms. Trust me, most of these job postings are unrealistic, and they know it. If you fit even half of the bill in these job postings and are good at what you do, you should be fine.

3- Keep Learning Enjoyably

Don't switch off, and don't go to some remote monastery to meditate. We have made these careers after lots of hard work. We have to stay relevant in the industry. We have to pay mortgages, and we have to provide for our families. What really stresses us is the fear of lagging behind and going obsolete. We keep doing this cloud stuff because we enjoy doing it; what bothers us is the endless list of things to learn. Just follow the above 2 points and this stress should be very manageable, and then you can focus on the things which you really should be learning and which you enjoy learning.

Happy learning and progressing.

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 6

Thu, 2021-06-17 21:55

 Welcome to the next edition of weekly Cloud Vanity. There is so much razzmatazz about cloud that for a second we tend to think that every company is already on some sort of cloud. A recent study has found that still only 35% of companies are in the cloud, and the rest of them are still thinking or planning to migrate. So there is still a lot of opportunity there.

AWS:

Announcing the AWS Security and Privacy Knowledge Hub for Australia and New Zealand

Few things have changed the world more than the internet, and at the heart of the internet is the open source LAMP stack. LAMP, short for Linux, Apache, MySQL, and PHP, enabled developers to build new, interactive web experiences

Australian Commonwealth Government agencies are subject to specific requirements set by the Protective Security Policy Framework (PSPF) for securing connectivity between systems that are running sensitive workloads, and for accessing less trusted environments, such as the internet.

Today, Dave Brown, VP of Amazon EC2 at AWS, announced the Graviton Challenge as part of his session on AWS silicon innovation at the Six Five Summit 2021.

AWS Step Functions allow you to build scalable, distributed applications using state machines. Until today, building workflows on Step Functions required you to learn and understand the Amazon States Language (ASL).

Azure:

Now’s the time to register for the free Azure Hybrid and Multicloud Digital Event on Tuesday, June 29, 2021, from 9:00 AM–11:00 AM Pacific Time, delivered in partnership with Intel.

For over three years, I have had the privilege of leading the SAP solutions on Azure business at Microsoft and of partnering with outstanding leaders at SAP and with many of our global partners to ensure that our joint customers run one of their most critical business assets safely and reliably in the cloud. 

There are many factors that can affect critical environment (CE) infrastructure availability—the reliability of the infrastructure building blocks, the controls during the datacenter construction stage, effective health monitoring and event detection schemes, a robust maintenance program, and operational excellence to ensure that every action is taken with careful consideration of related risk implications.

The power of 5G, IoT, and real-time AI will unlock new and innovative services for enterprises across the world to accelerate their transformation toward Industry 4.0 as they evolve and adopt diverse new business models. 

Cloud and edge computing are coming together as never before, leading to huge opportunities for developers and organizations around the world. Digital twins, mixed reality, and autonomous systems are at the core of a massive wave of innovation from which our customers already benefit.

GCP:

As your organization evolves, the cloud can be a powerful tool to drive growth, improve efficiency, and reduce costs. In fact, the cloud is so powerful that most organizations find themselves running on multiple clouds

At its core, Data and Analytics allows us to make impactful decisions by deriving insights from our data. In the pursuit of making data meaningful, data scientists and engineers are often tasked with building end-to-end workflows to ingest, process and analyze data.

At Google Cloud, we believe moving to the cloud shouldn’t have to mean starting over from scratch. That’s why we’re on a mission to give you choices for how you run your enterprise workloads, including migrating and modernizing your Windows workloads. 

In times of significant disruption, organizations are faced with three choices: Retrench within legacy solutions, pause and do nothing while waiting for more data or different circumstances, or press ahead, potentially even accelerating to realize the desired outcome.

Our collective understanding of work—where it takes place and how it gets done—has been transformed over the last year.

Others:

The Clouds are Thickening: An Overview of The SaaS Ecosystem and Big Cloud Providers

The Move Toward Simplicity: Why a Single-Vendor Approach to AI-Powered Automation Matters

Crafting with XieSandi is an educational simulation video game that describes how to craft different things, such as handcrafting and DIY.

Imagine one hybrid cloud platform that provides the automation, observability and cloud-native capabilities necessary to keep business, technology and teams connected while delivering the best digital experiences now and in the future.

Cloud operating model refers to the operational model used by an I&O organization when adopting cloud and executing its cloud strategy. The model is used to transform traditional IT management within an organization to be capable of managing the clouds that the business consumes.

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 5

Thu, 2021-06-10 21:09

 Welcome to the next edition of weekly Cloud Vanity. As usual, this edition casts light on multiple cloud providers and what's happening in their sphere. From the mega players to the small fish in the ocean, it covers it all. Enjoy!

AWS:

Reducing risk is the fundamental reason organizations invest in cybersecurity. The threat landscape grows and evolves, creating the need for a proactive, continual approach to building and protecting your security posture. Even with expanding budgets, the number of organizations reporting serious cyber incidents and data breaches is rising.

Streaming data presents a unique set of design and architectural challenges for developers. By definition, streaming data is not bounded, having no clear beginning or end. It can be generated by millions of separate producers, such as Internet of Things (IoT) devices or mobile applications. Additionally, streaming data applications must frequently process and analyze this data with minimal latency.

This post presents a solution using AWS Systems Manager State Manager that automates the process of keeping RDS instances in a start or stop state.

Over the last few years, Machine Learning (ML) has proven its worth in helping organizations increase efficiency and foster innovation. 

GCP:

In recent years, the grocery industry has had to shift to facilitate a wider variety of checkout journeys for customers. This has meant ensuring a richer transaction mix, including mobile shopping, online shopping, in-store checkout, cashierless checkout or any combination thereof like buy online, pickup in store (BOPIS).  

At Google I/O this year, we introduced Vertex AI to bring together all our ML offerings into a single environment that lets you build and manage the lifecycle of ML projects. 

Dataflow pipelines and Pub/Sub are the perfect services for this. All we need to do is write our components on top of the Apache Beam sdk, and they’ll have the benefit of distributed, resilient and scalable compute.

In a recent Gartner survey of public cloud users, 81% of respondents said they are working with two or more providers. And as well you should! It’s completely reasonable to use the capabilities from multiple cloud providers to achieve your desired business outcomes. 

Azure:

Generators at datacenters, most often powered by petroleum-based diesel, play a key role in delivering reliable backup power. Each of these generators is used for no more than a few hours a year or less at our datacenter sites, most often for routine maintenance or for backup power during a grid outage. 

5 reasons to attend the Azure Hybrid and Multicloud Digital Event

For over three years, I have had the privilege of leading the SAP solutions on Azure business at Microsoft and of partnering with outstanding leaders at SAP and with many of our global partners to ensure that our joint customers run one of their most critical business assets safely and reliably in the cloud. 

There are many factors that can affect critical environment (CE) infrastructure availability—the reliability of the infrastructure building blocks, the controls during the datacenter construction stage, effective health monitoring and event detection schemes, a robust maintenance program, and operational excellence to ensure that every action is taken with careful consideration of related risk implications.

Others:

Anyone who has even a passing interest in cryptocurrency has probably heard the word ‘blockchain’ branded about. And no doubt many of those who know the term also know that blockchain technology is behind Bitcoin and many other cryptocurrencies.

Alibaba Cloud Log Service (SLS) cooperates with RDS to launch the RDS SQL audit function, which delivers RDS SQL audit logs to SLS in real time. SLS provides real-time query, visual analysis, alarm, and other functionalities.

How AI Automation is Making a First-of-its-Kind, Crewless Transoceanic Ship Possible

Enterprise organizations have faced a compendium of challenges, but today it seems like the focus is on three things: speed, speed, and more speed. It is all about time to value and application velocity—getting applications delivered and then staying agile to evolve the application as needs arise.

Like many DevOps principles, shift-left once had specific meaning that has become more generalized over time. Shift-left is commonly associated with application testing – automating application tests and integrating them into earlier phases of the application lifecycle where issues can be identified and remediated earlier (and often more quickly and cheaply).

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 4

Thu, 2021-06-03 19:31

 Welcome to the next edition of weekly Cloud Vanity. The foundation of any cloud matters. Cloud is, and always will be, a distributed hybrid phenomenon. That is why architecting, developing, and operating a hybrid mix of workloads requires stable, scalable, and reliable cloud technologies. This edition discusses a few of them from across the different clouds out there.


AWS:

AWS SAM or Serverless Application Model is an open source framework that you can use to develop, build and deploy your serverless applications.

Pluralsight, Inc., the technology workforce development company, today announced that it has entered into a definitive agreement to acquire A Cloud Guru (ACG).

AWS Lambda Extensions are a new way to integrate your favorite operational tools for monitoring, observability, security, and governance with AWS Lambda. Starting today, extensions are generally available with new performance improvements and an expanded set of partners including Imperva, Instana, Sentry, Site24x7, and the AWS Distro for OpenTelemetry.

Amazon SQS is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. 

Let’s say your Python app uses DynamoDB and you need some unit tests to verify the validity of your code, but you aren’t sure how to go about doing this.

Azure:

Personal access tokens (PATs) make it easy to authenticate against Azure Devops to integrate with your tools and services. However, leaked tokens could compromise your Azure DevOps account and data, putting your applications and services at significant risk.

Azure announces general availability of scale-out NVIDIA A100 GPU Clusters: the fastest public cloud supercomputer.

La Liga, the foremost Spanish football league, has expanded its partnership with Microsoft Azure to focus on machine learning (ML), over the top (OTT) services, as well as augmented reality.

A little over a year ago, Microsoft Build 2020 was Microsoft’s first flagship event to become all-digital early in the COVID-19 pandemic.

Generators at datacenters, most often powered by petroleum-based diesel, play a key role in delivering reliable backup power. Each of these generators is used for no more than a few hours a year or less at our datacenter sites, most often for routine maintenance or for backup power during a grid outage. 

GCP:

Having constant access to fresh customer data is a key requirement for PedidosYa to improve and innovate our customer’s experience. Our internal stakeholders also require faster insights to drive agile business decisions. 

5 ways Vertex Vizier hyperparameter tuning improves ML models

Getting started with Kubernetes is often harder than it needs to be. While working with a cluster “from scratch” can be a great learning exercise or a good solution for some highly specialized workloads, often the details of cluster management can be made easier by utilizing a managed service offering. 

Zero-trust managed security for services with Traffic Director

Databases are part of virtually every application you run in your organization and great apps need great databases. This post is focused on one such great database—Cloud Spanner.

Others:

Kubernetes is a robust yet complex infrastructure system for container orchestration, with multiple components that must be adequately protected. 

It is no contradiction to say that being ‘cloud-native’ has not much to do with cloud computing. There is an idea that cloud is a place, a suite of technologies or services that run somewhere in data centres. But the cloud is not a place; it is a way of working.

The most innovative companies of 2021 according to BCG: Alphabet, Amazon, Microsoft all make it.

In this article, the author discusses how cloud computing has changed the traditional approach to operation and maintenance (O&M).

This June, a small marine research non-profit with a huge vision will launch a first-of-its-kind, crewless transoceanic ship that will attempt to cross the Atlantic Ocean without human intervention.

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 3

Thu, 2021-05-27 20:48

 Welcome to the next edition of weekly Cloud Vanity. IBM Cloud is not having a good time out there. It was hit by another outage this week, just five days after a similar incident. The root cause was an unidentified "severity-one" incident that impacted multiple services across multiple locations. This once again underlines that you need a high availability and DR plan in the cloud too.


AWS:

Authorizing functionality of an application based on group membership is a best practice. If you’re building APIs with Amazon API Gateway and you need fine-grained access control for your users, you can use Amazon Cognito

VMware Cloud on AWS allows customers to run VMware vSphere workloads on the AWS global infrastructure. This means you can run vSphere workloads across all of the AWS Regions where VMware Cloud on AWS is available.

CloudFormation Guard, an open source tool that helps validate your AWS CloudFormation templates against a rule set to keep AWS resources in compliance with company guidelines.

AWS Security Hub provides a comprehensive view of the security alerts and security posture in your accounts. Now you can import AWS IoT Device Defender audit findings into Security Hub.

Customers who are running fleets of Amazon Elastic Compute Cloud (Amazon EC2) instances use advanced monitoring techniques to observe their operational performance. Capabilities like aggregated and custom dimensions help customers categorize and customize their metrics across server fleets for fast and efficient decision making. 

Azure:

Microsoft has published a root cause analysis of an outage of its Azure Domain Name System that struck the cloud platform over Easter, causing intermittent failures for customers accessing and managing their Microsoft services globally.

Machine learning is changing the way we interact with each other and improving many aspects of our lives. In recent years, a variety of tools and frameworks have been developed to make it easier to build and deploy machine learning models into user-facing applications.

Function App keys are placed in the azure-webjobs-secrets folder in Blob Container. If this folder is missing, this could mean that the Function App is unable to connect to the storage account referenced by the Function App Application Setting “AzureWebJobsStorage”. This could happen either because of a network misconfiguration or because of an issue on the storage side.

Java is one of the most popular programming languages, used by over seven million developers to create everything from enterprise applications to complex robots. 

Azure at Microsoft Build recap: build amazing things on your terms, anywhere

GCP:

Using AI-powered machine learning models to identify fraudulent unemployment claims

Google has won a deal to provide cloud services to Elon Musk's SpaceX, which has launched a slew of Starlink satellites to provide high-speed internet, it said on Thursday. SpaceX will set up ground stations within Google's data centres that connect to the Starlink satellites

How to leverage global address resolution using Cloud DNS in Google Cloud VMware Engine

Analyze your logs easier with log field analytics

With Datashare, data publishers, aggregators, and consumers can come together to exchange licensed datasets on Google Cloud securely, quickly, and easily.

Others:

CNA Financial, the US insurance conglomerate, has apparently paid $40m to ransomware operators to get its files back.

Oracle Chairman and CTO Larry Ellison was seemingly omnipresent at the annual Oracle OpenWorld conference last week, providing his unique insights on subjects ranging from why autonomous technologies are so fundamentally important to what keeps him engaged after decades in the business.

Use o to radically accelerate your cloud operations workflow. Spend less time searching the docs and say goodbye to the days of copy-and-pasting those long OCIDs.

TimescaleDB is a time series data plug-in for PostgreSQL. Its version 1.5 enables automatic compression.

Alibaba’s Winning Cloud Formula Is Coming Under Pressure

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 2

Thu, 2021-05-27 20:47

 Welcome to the next edition of weekly Cloud Vanity. With all the hype around cloud computing, you might think that everyone is already in the cloud, but that's not really true. It's still just the beginning, with a long way to go. So don't think it's already too late to jump on the bandwagon. If you are thinking about shifting to a cloud career, then do it now. If you are a company thinking about moving workloads to cloud, then do it, because there is no other option if you want to survive.


AWS:

With the launch of AWS Distro for OpenTelemetry, AWS will continue to help drive advances in observability technologies, enhancing innovation and scalability for the entire OpenTelemetry community by contributing 100% of all changes to the upstream.

CloudEndure Migration can move applications from any physical, virtual, or cloud-based infrastructure to AWS at no charge. This complements AWS Server Migration Service (AWS SMS), which is an agentless service for migrating on-premises workloads to AWS. And now we have AWS Application Migration Service for lift and shift migrations.

Monitoring SQL Server is an essential aspect of any relational database management system (RDBMS) when dealing with performance problems. 

A common practice when creating a data model design, especially in the relational database management system (RDMS) world, is to start by creating an entity relationship diagram (ERD). Afterwards, you normalize your data by creating a table for each entity type in your ERD design. 

Even if you don't like AWS IAM, drop by to wish it happy birthday. Yes AWS is celebrating or marking the birthday of AWS Identity and Access Management (IAM). 

Azure:

Infrastructure-as-code tools like ARM templates and Terraform are more and more used to deploy infrastructure solutions. In general, these tools run within the context of a service principal so there needs to be an account with high privileges – at least high enough to deploy a given type of resource in your cloud environment. 

Software available under the partnership includes Oracle WebLogic, Oracle Linux, and Oracle Database, as well as interoperability between Oracle Cloud Infrastructure (OCI) and Azure. 

With the ever-increasing adoption of cloud-based solutions, and the incredibly complex make-up of the application architectures; the ability to effectively manage, orchestrate, and monitor the scenarios for search, security, and operations are becoming very critical for the success of the businesses.

Modern web app design patterns leverage microservices best practices for performance, agility, and extensibility. Azure Static Web Apps is a turnkey service for these modern full-stack web apps with pre-built and pre-rendered static front-ends, and serverless API backends. 

One detail updating today is the Azure “A” icon, which will be rolled out in product experiences and across related sites in the coming weeks. The new Azure icon represents the unity of Azure within the larger Microsoft family of product icons.

GCP:

Today’s healthcare organizations are grappling with mountains of data, increasing regulations, and escalating customer expectations. To add to these, healthcare organizations deal with highly sensitive personal data that needs to be protected.

Since its launch in 2016, the Google Cloud Public Datasets Program has provided a catalog of curated public data assets in optimized formats on BigQuery and Cloud Storage in partnership with a number of data providers.

A data cloud offers a comprehensive and proven approach to cloud and embraces the full data lifecycle, from the systems that run your business, where data is born, to analytics that support decision making, to AI and machine learning (ML) that predict and automate the future. 

VPC Flow Logs is one such enterprise-grade network traffic analysis tool, providing information about TCP and UDP traffic flow to and from VM instances on Google Cloud, including the instances used as Google Kubernetes Engine (GKE) nodes.

Others:

This post covers setting up API Gateway logging, setting up Logging Analytics, setting up Service Connector Hub to send API Gateway logs to Logging Analytics, and creating a Dashboard. Prerequisites include a working knowledge of OCI API Gateway and OCI in general.

Applications and use cases continue to evolve around data and enhanced storage needs in the cloud. For organizations, building their own infrastructure and storage solutions to address the variable demands of their applications is expensive and complex. Oracle makes storage seamless, inexpensive, high-performing, and flexible to support a wide range of use cases without sacrificing enterprise capabilities.

Businesses spend billions of hours a year on work that strips people of time and keeps them from focusing on higher-value things. AI-powered Automation helps people reclaim up to 50% of their time, and that’s something we all need.

The word “automation” often reminds people of assembly lines and manufacturing processes. But in a digital world, automation isn’t about the delivery of goods — it’s about making every interaction, experience and process more intelligent and impactful. This helps companies deliver value to their customers and gain a competitive advantage in their industry.

Financial institutions around the world are dramatically accelerating digital transformation. In the financial services industry, over 36 billion customer records were exposed in Q3 of 2020. The IBM Cloud for Financial Services provides a way for banks and financial institutions to migrate workloads to the cloud platform.

Categories: DBA Blogs

How to Enable SSH Equivalency Between EC2 Instances

Tue, 2021-04-20 01:55

 If you want to log in to a Linux instance from another Linux instance without a password or without specifying the key, then SSH equivalency is the solution.

Normally, in order to set up SSH equivalency between 2 Linux instances, you create both public and private keys, then copy the public key over to the other instance and add it to its authorized_keys file, etc.

But with an EC2 instance in AWS, you have to create or specify the key pair at instance launch time. When you launch an EC2 instance, the public key is already present in the home directory of the user; for example, on Amazon Linux the public key is already in the /home/ec2-user/.ssh/authorized_keys file. That is why you only need the private key to SSH into that server.

Let's say you have another Linux-based EC2 instance and you want to establish SSH equivalency between these two instances, and suppose both are using the same key pair. That means both already have the public key present in their /home/ec2-user/.ssh/authorized_keys file. In that case, all you need to do on both servers to establish SSH equivalency is the following:


1- Login to Instance 1

2- Go to the /home/ec2-user/.ssh/ directory

3- Create a new file for the private key:

touch id_rsa
chmod 700 id_rsa

4- Copy the content of your pem key and paste it into this id_rsa file

Now you should be able to ssh to the other server, which has the same keypair.

Repeat the above steps on the other server if you want to enable SSH equivalency in the reverse direction.

Categories: DBA Blogs

Where to Put PostgreSQL in AWS

Thu, 2021-04-15 22:44

When it comes to putting a PostgreSQL database in AWS, you are spoiled for choice. There are 3 ways to do it:



1) Install and configure PostgreSQL on an EC2 instance.

2) Amazon RDS for PostgreSQL.

3) Amazon Aurora for PostgreSQL.
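For instance, option 2 can be provisioned in a few lines of boto3. This is only a sketch; the region, identifiers, and credentials below are hypothetical placeholders.

import boto3

# Region and all identifiers below are placeholders.
rds = boto3.client("rds", region_name="ap-southeast-2")

# Provision a small RDS for PostgreSQL instance.
rds.create_db_instance(
    DBInstanceIdentifier="demo-postgres",
    DBInstanceClass="db.t3.micro",
    Engine="postgres",
    MasterUsername="demoadmin",
    MasterUserPassword="change-me-please",
    AllocatedStorage=20,
)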

You can watch the whole video here.

Categories: DBA Blogs

One Reason to Run Oracle on Google Cloud Platform

Wed, 2021-03-17 02:29

There is one reason to run Oracle on Google Cloud Platform, one solid and compelling reason. It has nothing to do with cost, and it has nothing to do with performance.

In all fairness, you can get cost savings (or not) with any cloud provider in terms of software and hardware. But if you are running, or have to run, Oracle, then cost is probably not your issue. For me, the one differentiating reason is the presence of Google BigQuery in GCP.

Serverless, fast, easy, and very powerful, BigQuery is an attraction all of its own if you compare it to other competing cloud offerings. I am observing more and more companies drawn to GCP just to use BQ as the unified warehouse of their data. Companies are using ETL and ELT tools and flows to push data into BQ from all sorts of databases and data stores on AWS, OCI, and Azure.

So if you have a choice, then why not put your Oracle database on a GCP VM or on their bare metal? If you even mention that to your GCP sales rep, chances are very strong that they will get you a very good discount. Be sure to mention that you intend to integrate other GCP services with that Oracle database in the future, and you might get the bare metal for free. That's my guess, but there is no harm in trying.


Categories: DBA Blogs

Compartments in OCI

Sat, 2021-03-13 21:23

 One of my favorite concepts in Oracle Cloud Infrastructure (OCI) is compartments. If you have worked in AWS, they may at first seem redundant and cumbersome, but on the contrary, they are quite useful and make things less cluttered.

I think if AWS got a chance to reorganize their cloud governance model, they might also introduce something like this, but then, they don't like to copy things.

A compartment is used to organize your cloud resources like compute instances, buckets, etc. Compartments are a global concept and span multiple regions; you can connect your resources across regions within the same compartment.

The OCI account is called a tenancy. When you create a tenancy, you also get a default compartment, called the 'root compartment'. Of course, you can create many other compartments too.

One of the biggest advantages of OCI compartments is that they enable you to control the cost of your cloud resources. You can assign budgets, quotas, and cost tags to a compartment and its resources. You can attach policies to them, which enables you to control access in a unified and centralized way. All you have to do is design the layout of the resources.
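To see how your layout looks, here is a minimal sketch with the OCI Python SDK that lists the compartments directly under the root compartment; it assumes a configured ~/.oci/config.

import oci

# Load the default profile from ~/.oci/config.
config = oci.config.from_file()
identity = oci.identity.IdentityClient(config)

# The tenancy OCID doubles as the root compartment's OCID.
for comp in identity.list_compartments(config["tenancy"]).data:
    print(comp.name, comp.lifecycle_state)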

Categories: DBA Blogs

Solution of Nuget Provider Issue with PowerShell and AWS Tools

Wed, 2021-02-24 20:08

 On an AWS EC2 Windows 2012 server, my goal was to write some data to an S3 bucket. I was using a small PowerShell script to copy the file to the S3 bucket. For that I needed to install AWS Tools for PowerShell, and I used the following command at a PowerShell prompt running as administrator:

Windows PowerShell
Copyright (C) 2016 Microsoft Corporation. All rights reserved.

PS C:\Users\SRV> Install-Module -Scope CurrentUser -Name AWSPowerShell.NetCore -Force

and it failed with the following error:

NuGet provider is required to continue
PowerShellGet requires NuGet provider version '2.8.5.201' or newer to interact with NuGet-based repositories. The NuGet
provider must be available in 'C:\Program Files\PackageManagement\ProviderAssemblies' or
'C:\Users\SRV\AppData\Local\PackageManagement\ProviderAssemblies'. You can also install the NuGet provider
by running 'Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force'. Do you want PowerShellGet to install
and import the NuGet provider now?
[Y] Yes  [N] No  [S] Suspend  [?] Help (default is "Y"): y
WARNING: Unable to download from URI 'https://go.microsoft.com/fwlink/?LinkID=627338&clcid=0x409' to ''.
WARNING: Unable to download the list of available providers. Check your internet connection.
PackageManagement\Install-PackageProvider : No match was found for the specified search criteria for the provider
'NuGet'. The package provider requires 'PackageManagement' and 'Provider' tags. Please check if the specified package
has the tags.

Solution:

The solution is to enable TLS 1.2 on this Windows host, which you can do by running the following in PowerShell in administrator mode (this sets the 32-bit .NET Framework key; on 64-bit PowerShell the same value may also be needed under HKLM:\SOFTWARE\Microsoft\.NetFramework\v4.0.30319):


Set-ItemProperty -Path 'HKLM:\SOFTWARE\Wow6432Node\Microsoft\.NetFramework\v4.0.30319' -Name 'SchUseStrongCrypto' -Value '1' -Type DWord


Close your PowerShell window, reopen it as administrator, and check that the TLS 1.2 protocol is present by typing the following command at the PS prompt:

[Net.ServicePointManager]::SecurityProtocol

If the above shows Tls12 in the output, then we are all good, and you should now be able to install AWS Tools.

I hope that helps.




Categories: DBA Blogs

Boto3 Dynamodb TypeError: Float types are not supported. Use Decimal types instead

Mon, 2021-02-22 01:26

 I was trying to ram data into AWS DynamoDB via Boto3, and the streaming failed with the following error:


  File "C:\Program Files\Python37\lib\site-packages\boto3\dynamodb\types.py", line 102, in serialize

    dynamodb_type = self._get_dynamodb_type(value)

  File "C:\Program Files\Python37\lib\site-packages\boto3\dynamodb\types.py", line 115, in _get_dynamodb_type

    elif self._is_number(value):

  File "C:\Program Files\Python37\lib\site-packages\boto3\dynamodb\types.py", line 160, in _is_number

    'Float types are not supported. Use Decimal types instead.')

TypeError: Float types are not supported. Use Decimal types instead.



I was actually getting some raw data points from CloudWatch for later analytics. These data points were floats, which are not supported by DynamoDB. Instead of importing decimal libraries or doing JSON manipulation, you can solve the above with a simple Python format expression like this:

"{0:.2f}".format(datapoint['Average'])

It worked like a charm afterwards. I hope that helps.
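If you would rather keep the values numeric instead of formatting them into strings, the Decimal route the error message suggests is also only a couple of lines. Here is a sketch; the datapoint, table name, and key are hypothetical.

from decimal import Decimal
import boto3

# Hypothetical datapoint and table, for illustration only.
datapoint = {"Average": 42.3567}
table = boto3.resource("dynamodb").Table("cloudwatch_metrics")

# Going through str() hands Decimal the rounded text form rather than
# the float's full binary expansion.
table.put_item(Item={
    "metric_id": "cpu-average",  # hypothetical partition key
    "value": Decimal(str(datapoint["Average"])),
})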
Categories: DBA Blogs

Main SQL Window Functions for Data Engineers in Cloud

Fri, 2021-02-19 22:36

 Becoming a data engineer in the cloud requires a good grasp of SQL, among various other things. SQL is the premier tool for interacting with data sets. At first it seems daunting to see all those SQL analytic functions, but if you start with a tiny dataset like in the examples below and understand how these functions work, then it all becomes very easy for large datasets of any volume.

Once you know the basic structure of SQL and understand the basic clauses, it's time to jump into the main analytic functions. Below I have used SQL's WITH clause to generate a tiny dataset in Oracle, so you don't have to create a table and load it with sample data to play along. Just run the WITH clause with the accompanying SELECT statements, which demonstrate the common SQL window functions.


1- In this example, the sum and row_number functions work on each row of the whole window. (With this dataset, SumEachRow is 8, the sum of all t values, for every row.)

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,sum(t) over () as SumEachRow, row_number() over (order by t) as RN from x;


2- In this example, the sum and row_number functions work on each row of each partition of the window. The window is partitioned on column t.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,sum(t) over (partition by t) as SumEachRow, row_number() over (partition by t order by t) as RN from x;


3- In the following example, we have divided the window into 2 partitions by using a CASE statement within the PARTITION BY clause. One partition holds the rows where t=1, and the other partition is composed of the rest of the rows.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,sum(t) over (partition by CASE WHEN t = 1 THEN t ELSE NULL END) as SumEachRow, row_number() over (partition by CASE WHEN t = 1 THEN t ELSE NULL END order by t) as RN from x;


4- The example below is a variant of example 3. Here the row_number function works on the whole window instead of a partition, whereas the sum function works on the partitions.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,sum(t) over (partition by CASE WHEN t = 1 THEN t ELSE NULL END) as SumEachRow, row_number() over (order by t) as RN from x;


5- This example uses the LAG function to return the previous row's value. With LAG, the value for the first row is always null, as there is no previous value.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,lag(t) over (order by t) as Previous_t from x;


6- This example uses the LEAD function to return the next row's value. With LEAD, the value for the last row is always null, as there is no next value.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,lead(t) over (order by t) as Next_t from x;


7- This example shows that the FIRST_VALUE function returns the first value in the window for each row.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,first_value(t) over (order by t) as First_t from x;


8- This example shows that the FIRST_VALUE function returns the first value in each partition of the window for each row.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,first_value(t) over (partition by t order by t) as First_t from x;


9- This example shows that the LAST_VALUE function returns the last value in the window for each row. (The ROWS BETWEEN clause is needed because the default window frame only extends to the current row.)

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,last_value(t) over (order by t ROWS BETWEEN
           UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) as Last_t from x;


10- This example shows that the LAST_VALUE function returns the last value in each partition of the window for each row.

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,last_value(t) over (partition by t order by t ROWS BETWEEN
           UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING) as Last_t from x;


For an explanation of the ROWS BETWEEN UNBOUNDED clause, see this.

11- This example shows the RANK function, which is useful for Top-N or Bottom-N queries. The following ranks over the whole window. The main idea is that rank starts at 1 for the first row and stays the same for rows with the same value; when the value changes, the rank increments to the row's position from the top. (With this dataset, the ranks are 1, 2, 2, 4.)

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,rank() over (order by t) as Rank from x;


12- This example shows the RANK function again, this time for each partition of the window. (Here every row ranks 1, since each partition contains only a single t value.)

With x as (
   SELECT 'tom' as name, 1 AS t from dual
   UNION ALL
   SELECT 'harry' as name,2 AS t  from dual
   UNION ALL
   SELECT 'jade' as name,2 AS t  from dual
   UNION ALL
   SELECT 'ponzi' as name,3 AS t  from dual
)
select name,t,rank() over (partition by t order by t) as Rank from x;


PS. Yes, I know the formatting of the code chunks is not good enough, but that seems to be a limitation of the Blogger platform, and another note to self that I need to move to a better one.

Categories: DBA Blogs

Docker Behind Proxy on CentOS - Solution to Many Issues

Thu, 2021-01-28 22:50

If you are running Docker behind a proxy on CentOS and receiving timeout or network errors, use the steps below to configure proxy settings on the CentOS box where Docker is installed and you are trying to build a Docker image:

Login as the user which is going to build the image.

Create the directory with sudo:

    sudo mkdir -p /etc/systemd/system/docker.service.d

Create a file for the HTTP proxy setting at /etc/systemd/system/docker.service.d/http-proxy.conf and insert the following content into it:

    [Service]
    Environment="HTTP_PROXY=http://yourproxy.com:80/"

Create a file for the HTTPS proxy setting at /etc/systemd/system/docker.service.d/https-proxy.conf and insert the following content into it:

    [Service]
    Environment="HTTPS_PROXY=https://yourproxy.com:80/"

Reload the systemd daemon:

    systemctl daemon-reload

Restart Docker:

    service docker restart


Also, if you are trying to install Yarn or npm packages within your Dockerfile, then define the following environment variables in the Dockerfile:

ENV http_proxy=http://yourproxy.com
ENV https_proxy=http://yourproxy.com
ENV HTTP_PROXY=http://yourproxy.com
ENV HTTPS_PROXY=http://yourproxy.com


Notice that only the http protocol is specified for both the HTTPS and HTTP proxy variables. Restart Docker again after making these changes.

I hope that helps.


Categories: DBA Blogs
