Data Science Training Ahmedabad
08 November 2023

Want to Improve your Data Science skills? Join a Certification program for Career Growth

If you are interested in making a career in data science, there is a good chance of landing a lucrative job role. Demand for skilled professionals in the IT industry keeps growing, and more and more businesses need data scientists with up-to-date competence in analyzing data, whether standard or complex data sets, to help companies make more informed decisions. Therefore, you should earn a Data Science Certification in Ahmedabad that proves your knowledge of data science techniques and the situations in which you can use them.

How can joining Data Science Certification programs be valuable?

Having a data science certificate might not be a comprehensive solution for all your IT skill needs, but if you have a career in the IT sector, it can help you get a better job role while growing your skills. You can build experience in understanding various data sets and stay updated on how to use and interpret the data required by current IT trends. So, you should improve your qualifications as per the needs of your job role as a data analyst. Data scientists can also gain the skills to choose the right algorithms for business data with Microsoft Azure Training & Certification in Ahmedabad. If you work with data on the AWS platform, you can benefit from the AWS Security Training Course in Ahmedabad, where you can learn to work with a range of cloud-based services.

Today, businesses in various industries depend more and more on data scientists to handle the increasing volume of data being created and compiled. This is where data science plays an increasingly important role across sectors, offering a plethora of career possibilities. All you need are the proper qualifications to become a data scientist, which makes taking a data science certification program worthwhile. HighSkyIT Solution offers data science certification courses that enable you to build skills on both Azure and AWS, so you can effectively tackle all your data science projects.

Conclusion:

Data science is a dynamic field in the IT realm that offers opportunities for learning and career advancement as technologies and approaches keep evolving. Hence, you should enroll in a data science certification program that keeps you current on the newest tools and strategies. It will help you become a competitive and productive professional in the industry. Whether you want to become a data analyst or a big data engineer, taking these certification courses can help you keep your skills current and grow your career.

10 October 2023

What is Node.js And How to Install Node.js

What is Node.js

Node.js is an open-source, server-side runtime environment that enables you to execute JavaScript code on the server. It is made for creating scalable and efficient network applications, and it is widely used to create web applications, APIs, and other server-side programs.

1 JavaScript Runtime:- Node.js enables you to execute JavaScript code on the server, whereas JavaScript is typically associated with web browsers for client-side scripting.

2 Single-Threaded:- Although Node.js applications are single-threaded, they may effectively manage several concurrent tasks by using callbacks and asynchronous activities. For some applications, this may lead to better performance.

3 Event-Driven and Non-Blocking:- An event-driven, non-blocking I/O mechanism is the foundation of Node.js. This indicates that it can manage numerous connections simultaneously without delaying the execution of any code. Applications that demand a lot of concurrency and real-time communication are especially well suited for it.

4 Package Manager (npm):- The npm (Node Package Manager) ecosystem of open-source libraries and modules that comes with Node.js allows developers to expand the capabilities of the framework and make the process of creating applications more straightforward.

5 V8 JavaScript Engine:- Node.js uses the V8 JavaScript engine, created by Google and renowned for its great performance. V8 is quick and effective because it compiles JavaScript code into machine code.

6 Cross-Platform:- Node.js is extremely portable since it supports many different operating systems, including Windows, macOS, and many Unix-like platforms.

7 Large and Active Community:- A large number of resources, libraries, and tools are available for Node.js development because of the dynamic and active developer community that supports it.

The development of web servers and web applications, real-time applications like chat programs and online games, the creation of APIs, and the creation of command-line tools are just a few examples of common Node.js use cases. Its popularity has increased in part as a result of its adaptability in creating various applications and its ability to handle asynchronous I/O activities effectively.

Install Node.js

The Node.js official website provides a yum repository that must first be enabled on your system. Additionally, you require development tools to create native add-ons that may be installed on your system.

The default RHEL repositories include a version of Node.js that can be used to deliver a consistent experience across several platforms. Version 12.22.9 is what is currently available in the repository. While not the most recent version, it should be reliable and sufficient for quickly testing the language.

1 You can use the yum package manager to obtain this version. First, update your local package index by typing:

yum update
yum install nodejs

To confirm the installation, press Y when prompted. If you are asked to restart any services, hit ENTER to proceed with the default settings. To confirm that the installation was successful, ask node for its version number:

node -v

This is all there is to getting started with Node.js if the package in the repositories meets your needs. You should typically install npm, the Node.js package manager, as well. You can accomplish this by using yum to install the npm package:

yum install npm

2 Note that if you install Node.js from the NodeSource repository instead, you don’t need to install npm individually, because the NodeSource nodejs package includes both the node binary and npm.

You have now installed Node.js and npm using the yum package manager. The installation and management of several Node.js versions are covered in the section that follows.

 

3 The Node Version Manager is Used to Install Node

The Node Version Manager, or nvm, is a further flexible method of installing Node.js. You can simultaneously install and maintain numerous independent Node.js versions and their corresponding Node packages with this piece of software.

To learn how to install nvm on a RHEL 9 machine, visit the project’s GitHub page. The README file is displayed on the main page. Copy the curl command from there; it will give you the most recent version of the installation script.

It is usually a good idea to audit the script to ensure it isn’t doing anything you disagree with before piping the command through to bash. To do that, remove the | bash element from the end of the curl command:

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh

 

Take a look to make sure you understand the adjustments it is making. When you are satisfied, run the command again with | bash appended. The URL you use will change based on the most recent version of nvm, but as of right now, the script can be downloaded and run by typing:

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash

This will install the nvm script to your user account. To use it, you must first source your ~/.bashrc file:

source ~/.bashrc

You can now ask nvm which versions of Node are available:

nvm list-remote

The list is pretty lengthy! You can install a particular version of Node by typing any of the release versions you see. For example, to obtain version v16.14.0 (an LTS release), enter:

nvm install v16.14.0

To view the versions you have installed, type:

nvm list
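nvm can also switch between the versions you have installed. For example, to use the version installed above and make it the default for new shells (these are standard nvm commands):

nvm use v16.14.0
nvm alias default v16.14.0

Running node -v afterwards should report the selected version.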

 

15 September 2023

AWS RDS instance start Lambda function with EventBridge

1. Open the AWS Management Console: Go to the AWS Management Console and log in to your AWS account.

2. Choose RDS: From the list of AWS services, choose RDS (Relational Database Service).

 3. Click “Create Database”: On the RDS dashboard, click the “Create database” button.

 4. Choose a database engine: Select the engine you want to use for your RDS instance. Amazon RDS supports various database engines like MySQL, PostgreSQL, Oracle, SQL Server, MariaDB, etc.

5. Choose a use case: Select the use case that best fits your needs. This will determine the default settings for your RDS instance, such as the instance class, storage type, and allocated storage.

6 . Configure the instance: Configure the RDS instance by specifying its name, username, and password. You can also choose the instance type, storage type, allocated storage, and other settings based on your requirements.

7. Configure advanced settings: If needed, you can configure advanced settings such as backup retention, maintenance window, security groups, and VPC settings.

8. Launch the instance: After configuring all the settings, review your configuration and click “Create Database” to launch your RDS instance.

9. Please wait for the instance to launch: It may take several minutes for your RDS instance to launch. Once it is ready, you can connect to it using the endpoint provided in the AWS Management Console.

That’s it! You have now created an RDS instance in AWS. You can use this instance to host your database and connect to it from your applications.

IAM service policy

1. Open the IAM Management Console: Go to the AWS Management Console and log in to your AWS account. From the list of AWS services, choose “IAM” under “Security, Identity & Compliance”.

2. Create a new policy: In the left-hand navigation pane, click “Policies”, then click “Create policy”.

3. Select a policy template: On the Create Policy page, you can either create your custom policy or use a pre-defined policy template. To create a policy for RDS, you can select the “Amazon RDS” service from the list of available services.

4. Choose the actions: Next, you need to choose the actions that you want to allow or deny for this policy. For example, you might want to allow read-only access to RDS resources or grant permissions to create and modify RDS resources.

5. Choose the resources: Once you have selected the actions, specify the RDS resources to which this policy applies. You can choose to apply the policy to all resources or specify individual resources by ARN (Amazon Resource Name).


6. Review and create the policy: After specifying the actions and resources, review the policy details and click “Create policy” to save the policy.

7. Attach the policy to a user or group: Once you have created the policy, you need to attach it to a user or group that needs access to RDS resources. You can do this by navigating to the user or group in the IAM console, clicking on the “Permissions” tab, and then attaching the policy to the user or group.

That’s it! You have now created an IAM service policy for RDS and attached it to a user or group. The user or group can now perform the allowed actions on the specified RDS resources.

IAM service role

1. Navigate to the IAM dashboard.

2. Click on “Roles” from the left-hand menu.

3. Click on the “Create role” button.

4. Choose the type of trusted entity for your role: an AWS service, another AWS account, or a web identity provider.

5. Select the policies that define the permissions for your role. You can choose from existing policies or create a custom one.

6. Give your role a name and description.

7. Review your role and click “Create role” to save it.

That’s it! You have now created an IAM service role in AWS. You can use this role to grant permissions to an AWS service or other entities that need to perform actions on your behalf.

Lambda function

1. Navigate to the Lambda dashboard.

2. Click on the “Create function” button.

3. Choose how you want to create your function. You can author a function from scratch, start from a blueprint, or browse the serverless application repository.

4. Give your function a name and description.

5. Choose a runtime for your function, such as Python, Node.js, or Java.

6. Configure the function’s execution role, which determines the permissions that the function has to access AWS resources.

7. Write your function code or upload a ZIP file containing your code.

import boto3

def lambda_handler(event, context):
    # Initialize the RDS client
    rds = boto3.client('rds')

    # Start the RDS instance
    try:
        response = rds.start_db_instance(DBInstanceIdentifier='your-db-instance-id')
        print('RDS instance starting...')
        return {'status': 'starting'}
    except Exception as e:
        print(f'Error starting RDS instance: {str(e)}')
        return None

8. Set up your function’s environment variables and any additional settings, such as memory and timeout settings. Click “Create function” to save your Lambda function.

After creating your Lambda function, you can test it manually or set up a trigger to invoke it automatically. You can also monitor your function’s performance and troubleshoot any errors using the AWS Lambda console.

  CloudWatch

1. Navigate to the CloudWatch dashboard.

2. Click on “Events” from the left-hand menu.

3. Click on the “Create rule” button.

4. Choose the “Schedule” option under “Event Source”.

5. Configure the cron expression for when you want the RDS DB instance to start. Note that EventBridge schedule expressions use UTC; for example, the expression 25 5 * * ? * triggers every day at 05:25 UTC.

6. Choose your Lambda function as the target for the event rule.

 

7. Configure the specific action that you want to perform on the RDS DB instance, which in this case is to start it.

8. Give your rule a name and description.

9. Click “Create rule” to save your CloudWatch event rule.


After creating your CloudWatch event rule, it will trigger at the scheduled times and invoke your Lambda function, which starts the specified RDS instance. Be sure to test your rule to ensure it is working as expected.

STOP THE RDS DB INSTANCE

1. Create an IAM policy (following the same steps as above, but allowing the stop action).

2. Create an IAM role.

3. Create a Lambda function named RDS-stop-instance and attach the role (a minimal code sketch follows this list).

4. Create a CloudWatch event rule and choose the “Schedule” option under “Event Source”.

5. Configure the cron expression for when you want the RDS DB instance to stop. For example, the expression 10 6 * * ? * triggers every day at 06:10 UTC.
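The stop function mirrors the start function shown earlier. Here is a minimal sketch of what RDS-stop-instance could look like; the instance identifier is a placeholder that you should replace with your own:

import boto3

def lambda_handler(event, context):
    # Initialize the RDS client
    rds = boto3.client('rds')

    # Stop the RDS instance (replace the placeholder identifier with yours)
    try:
        response = rds.stop_db_instance(DBInstanceIdentifier='your-db-instance-id')
        print('RDS instance stopping...')
        return {'status': 'stopping'}
    except Exception as e:
        print(f'Error stopping RDS instance: {str(e)}')
        return None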


The RDS instance now stops successfully on the configured schedule.

Docker Certification in Ahmedabad
15 September 2023

Get Your Docker Certification Demystified For Container Mastery

Do you want to get your Docker certification to earn an industry-recognised credential? To get that recognition, you must pass the Docker Certified Associate (DCA) exam, so it’s time to start with a dedicated course to improve your Docker skills. Courses for Docker Certification in Ahmedabad are available at competitive prices, and professionals guide the candidates along the way. Let’s learn more about the certification course.

What Will You Achieve with the Certification Course?

  • Digital certificate and Docker Certified Associate logo.
  • Recognition of Docker skills with official Docker credentials.
  • Access to the Docker Certified professional network.

While preparing for your Docker certification exam, you have to cover major concepts related to Docker skills to become a proficient developer, application architect, and system administrator. Here are the concepts you will cover:

Running Containerised Applications

You will learn to run containerised apps from pre-existing images. This concept will help you to improve your programming and development skills by enabling you to spin up dev environments. There are centres for DevOps Online Training Ahmedabad where you can learn this concept.

Deploying Images in the Cluster

Another major concept is deploying images in the cluster in the form of containers, which helps you achieve continuous delivery.

Installation and Maintenance of Docker platform

This concept will provide you with a clear insight into the Docker platform. Here, you will learn to install and operate the platform. Moreover, you will also get an idea of its maintenance and upgrades. It will provide you with an insight into the internals of Docker.

Configuration and Troubleshooting

In this concept, you will learn to configure and troubleshoot the Docker engine. There are prominent Cloud Computing Certifications Ahmedabad that also offer Docker certification courses, where all these concepts are covered. When you dive deep into the core topics of configuration and troubleshooting, you will cover topics such as Orchestration, Installation and Configuration, Storage and Volumes, Image Creation, Management, and Registry, and Security and Networking.

Other Concepts of Container Mastery

There are also other concepts to cover on the Docker platform, such as triaging issue reports from stakeholders and resolving them, working with new Docker environments, and performing general maintenance. Also, you will learn to migrate traditional applications to containers; this concept will help you migrate your existing apps into containerised Docker apps. You can consult Ansible Training Ahmedabad to learn about the Docker certification.

These are the major concepts covered in Docker certification courses. To know more about the course, DCA exam, and concepts, get in touch with HighSkyIT Solution.

10 August 2023

how to create S3 bucket using Terraform

To use Terraform to construct an Amazon S3 bucket, you must define an appropriate resource block in your Terraform setup. Here’s a step-by-step tutorial on creating an S3 bucket with Terraform:

1 Configure AWS Credentials:
Before you continue, make sure you have your AWS credentials set up. You can use the AWS CLI aws configure command or specify them as environment variables.

2 Follow these steps to create a Terraform configuration:
Create a .tf file (for example, main.tf) to define your Terraform setup.

3 Define the S3 Bucket:
Add the following Terraform code to your main.tf file to define an S3 bucket resource:

terraform {
  required_providers {
    aws = {
        source = "hashicorp/aws"
        version = "5.8.0"
    }
  }
}

provider "aws" {
  region     = "ap-south-1"
  access_key = "Your_Access_Key"
  secret_key = "Your_Secret_Key"

}

resource "aws_s3_bucket" "bucket" {
  bucket = "highskybucket"

  tags = {  
    Name        = "My bucket"
  }
}

Replace “highskybucket” with your S3 bucket’s unique name. Bucket names must be globally unique across all of AWS.

4 Initialize Terraform:
Navigate to the directory containing your Terraform configuration file and run the following command:

 terraform init

5 Plan the Configuration:
In HashiCorp Terraform, the terraform plan command generates an execution plan outlining the modifications Terraform will make to your infrastructure based on your existing configuration. Without actually making the changes, it demonstrates to you what steps Terraform will take to create, update, or remove resources, for example. By doing so, you may examine and confirm the modifications before implementing them in your infrastructure.

terraform plan

6 Apply the Configuration:
Run the following command to create the AWS s3 Bucket:

terraform apply

7 Review and Confirm:
Terraform will display a plan of what it aims to build. After reviewing the plan, type yes to confirm and create the AWS s3 Bucket.

Output on AWS: once the apply completes, open the S3 service in the AWS console and confirm that the bucket has been created.
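If you prefer the command line, you can also confirm the bucket exists with the AWS CLI, assuming it is installed and configured with the same credentials:

aws s3 ls

The new bucket name should appear in the listing.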

 

How To Create an IAM User and Assign a Policy to the User with Terraform

How to Create an EC2 Instance Using Terraform

Linux Certification Ahmedabad
19 July 2023

Continuing Education with Red Hat Staying Ahead in Open Source Technologies

In today’s rapidly evolving digital landscape, staying ahead in open-source technologies is essential for professionals seeking to excel in the field. With the vast popularity and significance of Linux administration and Red Hat technologies, it becomes crucial to equip oneself with the necessary skills and knowledge. If you are based in Ahmedabad, India, you’re in luck. A leading training provider offers top-notch Red Hat Training Course & Certification Ahmedabad designed to enhance your proficiency and open doors to exciting career opportunities.

Some features of Linux Administration Online Classes in Ahmedabad

  • Comprehensive Curriculum:

This course provides a comprehensive curriculum that covers all aspects of managing and maintaining Linux-based systems. From basic concepts to advanced topics, you’ll gain a deep understanding of Linux architecture, command-line operations, user management, file systems, networking, security, and more. The curriculum is designed to equip you with the skills to handle real-world scenarios in Linux environments.

  • Flexibility and Convenience:

One of the primary advantages of online classes is the flexibility they offer. Whether you’re a working professional or a student, you can access the course materials and lectures at a time that suits you best. Companies like Highsky IT Solutions allow you to balance your learning with other commitments, making it convenient for individuals with busy schedules.

  • Interactive Learning Experience:

Engaging and interactive learning experiences are essential for effective comprehension and skill development. Through virtual labs, practical exercises, quizzes, and discussion forums, you’ll have hands-on opportunities to apply your knowledge, collaborate with peers, and seek guidance from experienced instructors.

  • Experienced Instructors:

To ensure a high-quality learning experience, Linux Administration Online Classes Ahmedabad are led by experienced instructors with extensive knowledge in the field. These instructors bring real-world expertise and industry insights to the virtual classroom, providing practical examples and guidance throughout the course.

  • Certification Opportunities:

Completing Linux Administration Online Classes may allow you to earn industry-recognized certifications. Choosing classes that align with recognized certification programs is essential to maximize the value of your learning journey.

Enhance Your Linux Expertise with RHCE, RHCSA, and Red Hat Training in Ahmedabad

In Ahmedabad, you can broaden your Linux administration skills through RHCE and RHCSA classes. These comprehensive programs offer a range of features to help you excel in Linux-based environments. RHCE RHCSA Classes in Ahmedabad provide in-depth knowledge and practical skills required to design, deploy, and manage Red Hat solutions effectively. Linux Training in Ahmedabad covers various topics such as system administration, network configuration, and security management. These certifications validate your expertise, enhancing your professional credibility. By enrolling in these programs, you can acquire valuable knowledge, hands-on experience, and potential career advancement opportunities in Linux administration.

Conclusion:

In a rapidly changing digital landscape, continuous education is vital for professionals seeking to stay ahead. Many offer a diverse range of online classes and training programs tailored to meet the demands of open-source technologies. By enrolling in Linux administration, Red Hat training, and certification courses, you can enhance your skill set and gain a competitive edge. Visit the highskyit.com website for more information and start your educational journey toward success.

05 July 2023

What Is OwnCloud & How To Install In Ubuntu 20.04

What Is OwnCloud?

Individuals and organizations can securely store, access, and share their files and documents using the self-hosted file synchronization and sharing platform known as OwnCloud. In contrast to cloud storage services like Dropbox, Google Drive, or OneDrive, it offers customers complete control over their data, serving as an alternative.

With OwnCloud, you may construct a private cloud storage solution on your own server or by using a hosting company. It offers capabilities like file synchronization between several devices, file sharing, group document editing, and data backup. The platform provides clients for many operating systems, such as Windows, macOS, Linux, Android, and iOS, enabling smooth file access from numerous devices.

One of OwnCloud’s main benefits is that you can store your data on your own servers or those of a reputable hosting company, guaranteeing that you will always have ownership and control over your files. Additionally, it offers options for encryption to increase security during file transfers and storage.

OwnCloud provides a number of extensions and plugins to enhance its functionality in addition to the essential file synchronization and sharing features. Task management, music streaming, calendar and contact synchronization, and interaction with other services like Microsoft Office Online or Collabora Online for group editing are a few of them.

Overall, OwnCloud offers a versatile and adaptable cloud storage option that enables people and organizations to manage their files, share information, and collaborate while still having complete control over their data.

How To Install OwnCloud In Ubuntu 20.04?

1 System Packages Update:-

Use the apt command below to update the system packages and repositories before you begin.

# apt update -y && apt upgrade -y

2 Install  Apache, MariaDB, And PHP Packages:-

How to install MariaDB and use MariaDB redhat

( 1 ) Apache:- Apache Server is free and open-source web server software that allows websites to be hosted on the Internet. An Apache server is a software program that runs on one computer and makes the websites and content on that computer available to other computers on the Internet.

( 2 ) MariaDB:- Similar to MySQL, MariaDB is made to use tables, columns, and rows to store and manage structured data. It provides several programming interfaces and connectors for various computer languages, as well as SQL (Structured Query Language) for querying and modifying data.

( 3 ) PHP:- OwnCloud is written in PHP and is normally accessed through a web interface. For this reason, we will install the Apache web server to serve OwnCloud’s files, together with PHP and the other PHP modules required for OwnCloud to run efficiently.

# apt install -y \
  apache2 libapache2-mod-php \
  mariadb-server openssl redis-server wget php-imagick \
  php-common php-curl php-gd php-gmp php-bcmath php-imap \
  php-intl php-json php-mbstring php-mysql php-ssh2 php-xml \
  php-zip php-apcu php-redis php-ldap php-phpseclib

3 By using the dpkg command after the installation is finished, you can check to see if Apache was installed:- 

# dpkg -l apache2

4 Run the commands to launch Apache and allow it to start automatically:-

( 1 ) Start:- Start Apache2 Service

# systemctl start apache2

( 2 ) Enable:- The enable command configures the Apache2 service to start automatically at boot time:

# systemctl enable apache2

( 3 ) Status:- Check that the service is running.
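One way to do that, assuming systemd is managing services as on a standard Ubuntu 20.04 install:

# systemctl status apache2

The output should report the service as active (running).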

5 Check that PHP is installed and check its version:-

# php -v

6 MariaDB Secure installation:-

MariaDB, just like MySQL, is not secure by default. Therefore, you must take another step and run the mysql_secure_installation script.

The command guides you through a series of prompts. You will need to create a root password first, because the default Unix socket authentication for the MariaDB root user is not secure enough.

So, decline the Unix socket authentication by pressing n and hitting ENTER:

# mysql_secure_installation

NOTE: RUNNING ALL PARTS OF THIS SCRIPT IS RECOMMENDED FOR ALL MariaDB
SERVERS IN PRODUCTION USE! PLEASE READ EACH STEP CAREFULLY!

In order to log into MariaDB to secure it, we’ll need the current
password for the root user. If you’ve just installed MariaDB, and
you haven’t set the root password yet, the password will be blank,
so you should just press enter here.

Enter current password for root (enter for none): [Press Enter]

OK, successfully used password, moving on…

Setting the root password ensures that nobody can log into the MariaDB
root user without the proper authorisation.

Set root password? [Y/n]  [ Press Y ]

New password:                  [ redhat@123 ]
Re-enter new password:   [ redhat@123 ]
Password updated successfully!
Reloading privilege tables..
… Success!

By default, a MariaDB installation has an anonymous user, allowing anyone
to log into MariaDB without having to have a user account created for
them. This is intended only for testing, and to make the installation
go a bit smoother. You should remove them before moving into a
production environment.

Remove anonymous users? [Y/n] [ press Y ]

… Success!

Normally, root should only be allowed to connect from ‘localhost’. This
ensures that someone cannot guess at the root password from the network.

Disallow root login remotely? [Y/n] [ Press Y ]

… Success!

By default, MariaDB comes with a database named ‘test’ that anyone can
access. This is also intended only for testing, and should be removed
before moving into a production environment.

Remove test database and access to it? [Y/n] [ Press Y ]

– Dropping test database…
… Success!
– Removing privileges on test database…
… Success!

Reloading the privilege tables will ensure that all changes made so far
will take effect immediately.

Reload privilege tables now? [Y/n] [Press Y ]

… Success!

Cleaning up…

All done! If you’ve completed all of the above steps, your MariaDB
installation should now be secure.

Thanks for using MariaDB!

7 Create MariaDB Database:-

To store files both during and after installation, we must create a database for OwnCloud. Therefore, log into MariaDB.

mysql -u root -p

Enter password: redhat@123

MariaDB [(none)]> CREATE DATABASE highsky_db;
MariaDB [(none)]> GRANT ALL ON highsky_db.* TO 'harry'@'localhost' IDENTIFIED BY 'redhat@123';
MariaDB [(none)]> FLUSH PRIVILEGES;
MariaDB [(none)]> EXIT

8 Download OwnCloud:-

wget https://download.owncloud.com/server/stable/owncloud-complete-latest.tar.bz2

9 Extract Directory:-

# tar -xjf owncloud-complete-latest.tar.bz2
# ls

10 Set the permissions:-

# chown -R www-data:www-data owncloud
# chmod -R 755 owncloud

11 Move the directory to /var/www/:-

mv owncloud /var/www/

12 Configure Apache for OwnCloud:-

At this stage, we will set up Apache to serve OwnCloud’s files. To accomplish that, create the following OwnCloud configuration file:

# vim /etc/apache2/conf-available/owncloud.conf
Alias /owncloud "/var/www/owncloud/"

<Directory /var/www/owncloud/>
  Options +FollowSymlinks
  AllowOverride All

 <IfModule mod_dav.c>
  Dav off
 </IfModule>

 SetEnv HOME /var/www/owncloud
 SetEnv HTTP_HOME /var/www/owncloud

</Directory>

Save and close the file.

13 The next step is to run the commands listed below to activate all the necessary Apache modules and the newly added configuration:

# a2enconf owncloud

# a2enmod rewrite

# a2enmod headers

# a2enmod env

# a2enmod dir

# a2enmod mime

14 Restarting the Apache web server will make the modifications effective:-

systemctl restart apache2
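If the restart fails, a quick way to check the Apache configuration syntax for errors (a standard Apache utility, not shown in the original steps) is:

apachectl configtest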

15 Completing The Installation Of OwnCloud

The only step left is to finish the OwnCloud installation in a browser once all the relevant configuration has been completed. Open your browser and enter the address of your server followed by /owncloud (for example, http://your-server-ip/owncloud). On the setup page, create the admin account and enter the database details you created earlier:

Username = admin
Password = admin

Database User = harry
Database Password = redhat@123
Database name = highsky_db

OwnCloud is now successfully installed.

 

28 June 2023

What is Docker? And How To Install In Ubuntu 20.04

Containers are standardized, executable components that integrate application source code with the operating system (OS) libraries and dependencies necessary to run that code in any environment. Docker is an open-source platform that empowers developers to build, distribute, operate, update, and manage containers.

The magic bullet that permanently fixed the virtualization and software container issues was Docker. Yes, that is a bold statement! Other products had made an effort to address these issues, but Docker’s novel strategy and ecosystem had completely eliminated the competition. You will learn the fundamentals of Docker in this course so that you can start utilizing it for your own applications and incorporating it into your workflow.

 1  Installing Docker

What Is Docker? How To Install On RHEL 9

It’s possible that the Docker installation package included in the official Ubuntu repository is out of date. We’ll install Docker from the official Docker repository to make sure we have the most up-to-date version. To accomplish that, we will first create a new package source, then install the package after adding the GPG key from Docker to confirm the downloads are legitimate.

Update your current list of packages first:

apt update

Install the following prerequisites to enable apt to use packages through HTTPS:

apt install apt-transport-https ca-certificates curl software-properties-common

Then add the GPG key for the official Docker repository to your system:

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

Next, add the Docker repository to your APT sources:

add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu focal stable"

Additionally, this will add the Docker packages from the recently added repository to our package database.

Verify that you are about to install from the Docker repository rather than the standard Ubuntu repository:
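For example, using apt-cache (the command is implied here but was not shown):

apt-cache policy docker-ce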

The result will look like this, although Docker’s version number may be different:

Note that docker-ce is not installed, but the installation candidate comes from the Docker repository for Ubuntu 20.04 (focal).

Finally, install Docker:

apt install docker-ce

Now that Docker has been set up, the daemon should be running and the process should be set to launch upon boot. Verify that it is operating:
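For example, assuming systemd is managing services:

systemctl status docker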

The output should show that the service is active (running).

The Docker client as well as the Docker service (daemon) are now included with the installation of Docker. Later in this lesson, we’ll look at how to use the docker command.
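As an optional extra check, not part of the original steps, you can confirm that Docker can pull and run images:

docker run hello-world

This downloads a small test image and prints a confirmation message if everything is working.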

The Docker installation is now complete.

Choose AWS CDK from HighSky IT to get a better Future
10 June 2023

Choose AWS CDK from HighSky IT to get a better Future

The AWS CDK, or Cloud Development Kit, is a powerful framework that helps developers define cloud infrastructure resources using familiar programming languages like Java, TypeScript, and Python. HighSky IT offers the AWS Security Training Course Ahmedabad, an invaluable resource for architects and developers looking to unlock the complete potential of the CDK.

This course is specially designed to give participants comprehensive skills and knowledge related to securing data and applications on the AWS platform. Such Data Science Training in Ahmedabad provides valuable insights into best practices and the security measures that can help protect AWS resources from potential threats. This post highlights some key takeaways from the training course.

What can you learn from AWS CDK or Security Training Course?

  • Understanding AWS Security Services

The courses for Ansible Training Ahmedabad offer an in-depth understanding of the different security services provided by AWS. Here, participants can learn about services like AWS CloudTrail, Identity and Access Management (IAM), Firewall Manager, AWS Key Management Service (KMS), AWS Config, and many more. Knowledge of these services can be used to improve the security posture of an AWS environment.

  • Identity and Access Management

The courses for AWS Security Certification Ahmedabad cover AWS IAM, which is one of the primary components of access control in AWS.  Here, learners can understand the best practices for implementing authorization mechanisms and secure authentication.

  • Securing AWS Infrastructure

In this AWS CDK, the participants can learn about best practices and important techniques for securing their AWS infrastructure.  It includes implementing the right access controls, configuring secure network architectures, and applying security policies in order to protect AWS resources.  The participants can also learn about encryption mechanisms, secure data storage options, and methods to secure transit data.

  • Incident Response and Compliance

This course gives proper guidance on responding to security incidents and creating an incident response plan in an AWS environment.  Here, participants can learn about AWS security best practices to respond to and mitigate common threats to security.  The learners can gain knowledge of industry regulations and compliance frameworks relevant to AWS, like PCI-DSS, GDPR, and HIPAA.

Apart from that, the course also helps participants understand best practices for security optimization, monitoring, and logging to secure AWS resources.

Conclusion

The AWS Security Training Course helps the participants with the skills and knowledge essential to implement security measures in their AWS environment.  If you want to learn more details on this course, then you can connect with Highsky IT Solutions to gain an understanding of securing infrastructure and AWS security services.

08 June 2023

How To Take an RDS Snapshot with a Lambda Function and a CloudWatch Scheduler?

1. Open the AWS Management Console: Go to the AWS Management Console and log in to your AWS account.

2. Choose RDS: From the list of AWS services, choose RDS (Relational Database Service).

3. Click “Create Database”: On the RDS dashboard, click the “Create database” button.

 4. Choose a database engine: Select the engine you want to use for your RDS instance. Amazon RDS supports various database engines like MySQL, PostgreSQL, Oracle, SQL Server, MariaDB, etc.

 

5 Choose a use case: Select the use case that best fits your needs. This will determine the default settings for your RDS instance, such as the instance class, storage type, and allocated storage.

6 . Configure the instance: Configure the RDS instance by specifying its name, username, and password. You can also choose the instance type, storage type, allocated storage, and other settings based on your requirements.

7. Configure advanced settings: If needed, you can configure advanced settings such as backup retention, maintenance window, security groups, and VPC settings.

8. Launch the instance: After configuring all the settings, review your configuration and click “Create Database” to launch your RDS instance.

9. Please wait for the instance to launch: It may take several minutes for your RDS instance to launch. Once it is ready, you can connect to it using the endpoint provided in the AWS Management Console.

 

That’s it! You have now created an RDS instance in AWS. You can use this instance to host your database and connect to it from your applications.

IAM service policy

1. Open the IAM Management Console: Go to the AWS Management Console and log in to your AWS account. From the list of AWS services, choose “IAM” under “Security, Identity & Compliance”.

2. Create a new policy: In the left-hand navigation pane, click “Policies”, then click “Create policy”.

3. Select a policy template: On the Create Policy page, you can either create your custom policy or use a pre-defined policy template. To create a policy for RDS, you can select the “Amazon RDS” service from the list of available services.

4. Choose the actions: Next, you need to choose the actions that you want to allow or deny for this policy. For example, you might want to allow read-only access to RDS resources or grant permissions to create and modify RDS resources.

5. Select the permission: under the Write access level, choose CreateDBSnapshot.

6. Choose the resources: Once you have selected the actions, specify the RDS resources to which this policy applies. You can choose to apply the policy to all resources or specify individual resources by ARN (Amazon Resource Name).

1 db: Represents a DB instance, which is an isolated database environment running in the cloud. Click to restrict access, then click This account and fill in:

( 1 ) Resource region

ap-south-1

( 2 ) Resource db instance name

database-1

And click ( Add ARNs ).

2 Snapshot: Represents a snapshot, which is a backup of the storage volume of your DB instance. Click to restrict access, then click This account and fill in:

( 1 ) Resource region

ap-south-1

( 2 ) Resource snapshot name

Highsky-Snapshot-name

And click ( Add ARNs ).

( 3 ) Alternatively, check Any in this account.

Next

7. Review and create the policy: After specifying the actions and resources, review the policy details and click “Create policy” to save the policy.

8. Attach the policy to a user or group: Once you have created the policy, you need to attach it to a user or group that needs access to RDS resources. You can do this by navigating to the user or group in the IAM console, clicking on the “Permissions” tab, and then attaching the policy to the user or group.

That’s it! You have now created an IAM service policy for RDS and attached it to a user or group. The user or group can now perform the allowed actions on the specified RDS resources.

IAM service role

1. Navigate to the IAM dashboard.

2. Click on “Roles” from the left-hand menu.

3. Click on the “Create role” button.

4. Choose the type of trusted entity for your role: an AWS service, another AWS account, or a web identity provider.

Use case: Allow an AWS service like EC2, Lambda, or others to perform actions in this account. Select Lambda.

5. Select the policies that define the permissions for your role. You can choose from existing policies or create a custom one.

6. Give your Role a name and description.

7. Review your role and click “Create role” to save it.

That’s it! You have now created an IAM service role in AWS. You can use this role to grant permissions to an AWS service or other entities that need to perform actions on your behalf.

Lambda function

1. Navigate to the Lambda dashboard.

2. Click on the “Create function” button.

3. Choose how you want to create your function. You can author a function from scratch, start from a blueprint, or browse the serverless application repository.

4. Give your function a name and description.

5. Choose a runtime for your function, such as Python, Node.js, or Java.

( A runtime is a version of a programming language or framework that you can use to write Lambda functions. Lambda supports runtime versions for Node.js, Python, Ruby, Go, Java, C# (.NET Core), and PowerShell (.NET Core)

To use other languages in Lambda, you can create your own runtime.

Note that the console code editor supports only Node.js, Python, and Ruby. If you choose a compiled language, such as Java or C#, you edit and compile your code in your preferred development environment and upload a deployment package to the function. )

For this example, we will use a Python runtime (such as Python 3.10).

6. Configure the function’s execution role, which determines the permissions that the function has to access AWS resources.

7. Write your function code or upload a ZIP file containing your code.

import boto3

def lambda_handler(event, context):
    # RDS client and identifiers for the instance and the snapshot to create
    rds_client = boto3.client('rds')
    instance_id = "database-1"
    # Snapshot identifiers must be unique; for a recurring schedule you may
    # want to append a date or timestamp to avoid name collisions.
    snapshot_id = "Highskysnapshot"
    try:
        response = rds_client.create_db_snapshot(
            DBInstanceIdentifier=instance_id,
            DBSnapshotIdentifier=snapshot_id
        )
        print(f"Snapshot '{snapshot_id}' creation initiated.")
        return {
            "snapshot_id": response['DBSnapshot']['DBSnapshotIdentifier'],
            "status": "started creating"
        }
    except Exception as e:
        print(f"Error creating snapshot: {str(e)}")
        return None

8. Set up your function’s environment variables and any additional settings, such as memory and timeout settings. Click “Create function” to save your Lambda function.

After creating your Lambda function, you can test it manually or set up a trigger to invoke it automatically. You can also monitor your function’s performance and troubleshoot any errors using the AWS Lambda console.

  CloudWatch

1. Navigate to the CloudWatch dashboard.

2. Click on “Events” from the left-hand menu.

3. Click on the “Create rule” button.

4. Choose the “Schedule” option under “Event Source”.

Click Continue To create rule 

5. Configure the cron expression for when you want the snapshot to be taken. EventBridge schedule expressions use UTC; for example, the expression 30 12 * * ? * triggers every day at 12:30 UTC.

6. Choose your Lambda function as the target for the event rule.

7. Configure the specific action that you want to perform on the RDS DB instance, which in this case is to take a snapshot of it.

8. Give your rule a name and description.

9. Click “Create rule” to save your CloudWatch event rule.

After creating your CloudWatch event rule, it will trigger at the scheduled times and invoke your Lambda function, which takes a snapshot of the specified RDS instance. Be sure to test your rule to ensure it is working as expected.

The scheduled RDS snapshot is now set up successfully.
