Learning Cloud 016: Slack Chat Bot

Are you a slacker? Have you been slacking recently? If not, you should be slacking more often.

This is just a little play on words, because today we're obviously talking about Slack. And of course I don't mean being lazy or putting things off; I mean the messaging tool Slack.

What is Slack?

Slack is a cloud-based team collaboration tool. In simpler terms, it's a chat program; think of Skype or Twitter.

Slack was created by Daniel Stewart Butterfield. He's a Canadian entrepreneur, best known for co-founding the photo-sharing website Flickr. By the way, he's worth $1.69 billion.

How does Slack work? Basically, you create a private group so that you can chat with only the members of that group. And there's no charge to use Slack.

Slack differs from Twitter in that your tweets are generally visible to everyone, unless it's a direct message. Slack focuses on private interaction and lets you embed pictures and video into your messages.

Slack uses the following concepts to organize groups:

  • Teams and Workspaces
  • Channels
  • Messages

In this episode I want to introduce you to a Qwiklab where you will build a chat bot for Slack.

Chat bots have the ability to interact with teams and users, respond to commands, and post notifications, giving all conversation participants visibility into team activities.

You will build a bot that posts CloudWatch alarms to your Slack channel. The high-level steps are:

  • Create a Slack account
  • Create a webhook in Slack
  • Create a Lambda function
  • Select the blueprint for Slack
  • Enter the webhook
  • Create the Python code (see the sketch below)
  • Test it
  • Create a CloudWatch alarm
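
To give a feel for the Python step, here's a minimal sketch of the Lambda handler, assuming the alarm arrives via an SNS notification. The webhook URL and channel are placeholders you'd replace with your own, and the Qwiklab's blueprint code may differ in its details.

```python
import json
import urllib.request

# Placeholders -- use the webhook URL Slack generates for you.
HOOK_URL = "https://hooks.slack.com/services/T0000/B0000/XXXXXXXX"
SLACK_CHANNEL = "#alarms"

def lambda_handler(event, context):
    # CloudWatch alarm notifications arrive wrapped in an SNS record.
    message = json.loads(event["Records"][0]["Sns"]["Message"])

    payload = {
        "channel": SLACK_CHANNEL,
        "text": "Alarm {} is now {}: {}".format(
            message["AlarmName"],
            message["NewStateValue"],
            message["NewStateReason"],
        ),
    }
    # Post the message to the Slack incoming webhook.
    req = urllib.request.Request(
        HOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return {"status": resp.status}
```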

You can find the detailed steps at the link below. You'll have to sign up to be able to see the instructions.

https://qwiklabs.com/focuses/3444 

Remember: "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you."

Bye bye for now, until next time.

 

-Raf

Learning Cloud 015: Generating thumbnails via Lambda

“AWS Lambda is a compute service that runs your code in response to events and automatically manages the compute resources for you making it easy to build applications that respond quickly to new information.”

Gartner gave a presentation at work, and one of the things they said is that serverless will be the future of cloud computing.

Potential of Lambda: it can do almost any task related to your infrastructure or app configuration, for example:

  • Launch instances
  • Load data to a DB
  • Read and process messages from a queuing system

The beauty is that you only have to write the code.

Netflix explains how the company can use event-based triggers to

  • automate the encoding process of media files,
  • validate backup completions
  • deploy instances at scale

Main Purpose:

The one use case I'll be showing you today is using Lambda to manipulate images, in particular how to create thumbnails.

Uploading images to S3 is convenient. Downloading those same images and using them on a website isn't as simple. We need thumbnails (in multiple sizes, mind you!) and probably a standard size in which to showcase the uploaded images.

I'll be referencing a Qwiklab.

Steps

  1. Log in to your account and select the Lambda service.
  2. Create a Lambda function, from a blueprint or "Author from Scratch".

       Select a language. Languages supported by Lambda:

      • Python, Node.js, C#, Java
      • Everything I've read suggests that Python provides the fastest performance
      • Lambda runs your code within milliseconds of an event

       Paste the code. The makeup of a Lambda function:

      • Library imports (boto3 is the AWS SDK for Python)
      • Functions
      • The handler
  3. Connect to S3 and create a bucket.
  4. Upload an image to the bucket; this will trigger the Lambda function (see the sketch below).
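
To make the flow concrete, here's a minimal sketch of what such a thumbnail function could look like. This is an assumption-laden outline rather than the Qwiklab's exact code: it presumes the Pillow imaging library is bundled with the deployment package, and the destination bucket name is a placeholder.

```python
import os

import boto3
from PIL import Image  # Pillow must be packaged with the function

s3 = boto3.client("s3")
DEST_BUCKET = "my-thumbnails-bucket"  # hypothetical destination bucket
THUMB_SIZE = (128, 128)

def lambda_handler(event, context):
    # The S3 PUT event tells us which object was just uploaded.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    download_path = f"/tmp/{os.path.basename(key)}"
    upload_path = f"/tmp/thumb-{os.path.basename(key)}"
    s3.download_file(bucket, key, download_path)

    # Resize in place; thumbnail() preserves the aspect ratio.
    with Image.open(download_path) as img:
        img.thumbnail(THUMB_SIZE)
        img.save(upload_path)

    # Write to a separate bucket so the upload doesn't re-trigger us.
    s3.upload_file(upload_path, DEST_BUCKET, f"thumbnails/{key}")
```

Writing the thumbnail to a different bucket (or at least a different prefix) matters: if the function wrote back to the same bucket that triggers it, you'd create an infinite loop of invocations.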

https://qwiklabs.com/focuses/2966

http://karlcode.owtelse.com/blog/2017/03/15/python-is-faster-than-node-dot-js/

https://aws.amazon.com/solutions/case-studies/netflix-and-aws-lambda/


Learning Cloud 014: How to Manage Logs in AWS

 

Hi there and welcome to the 14th episode of the Learning Cloud podcast. The place for all things Cloud and DevOps. My name is Raf Calderon.

I'm excited to be back podcasting. It's been a while since I last recorded; I needed time to clear my mind and organize my thoughts. But hey, I'm back, I'm glad, and it feels good. It's like riding a bike: you don't forget.

Today I want to talk about Logs, in particular how to manage Logs in the cloud.

Intro

Working with Logs is a challenging task.

Computing environments generate massive amounts of log files filled with all kinds of information: system logs, application logs, security logs. These files are generated every day, all year round, resulting in multiple gigabytes of unstructured plain-text data.

In most cases, manually going through plain log files and grepping for particular strings is daunting.

So we turn to professional applications to help us in this task.

There are two products that are the leaders in this market: Splunk and ELK.

 

Splunk

  • An enterprise product sold and supported by Splunk Inc. Targets the large-enterprise market.
  • Used a lot by security groups for long-term archival and forensics.
  • It's costly.

 

ELK

  • An open-source product. ELK is an acronym for Elasticsearch, Logstash, and Kibana.
  • You can buy support from Elastic.
  • Since ELK is open source, it's used by all kinds of companies, mostly small but also large.
  • I've seen it used in production to track app metrics in a more dynamic way.

 

Transition:

Now I want to switch my focus to the cloud. The logging solution in AWS is called CloudWatch.

Amazon CloudWatch Logs is a managed service that enables you to store your EC2 logs in the cloud.

How does it work?

  • Agent
  • AWS Console

Some features:

  • Archiving/retention
  • Metrics: e.g., count failed login attempts in the system log (see the sketch below)
  • Alarms
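
As an illustration of the metrics feature, here's roughly what creating a metric filter for failed logins looks like with the AWS CLI; the log group name, filter pattern, and namespace are placeholder values.

```
aws logs put-metric-filter \
    --log-group-name demo-syslog \
    --filter-name FailedLogins \
    --filter-pattern "Failed password" \
    --metric-transformations \
        metricName=FailedLogins,metricNamespace=LogMetrics,metricValue=1
```

Once the filter is in place, you can put a CloudWatch alarm on the resulting metric.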

 

Demo you should try

If you're interested in getting some hands-on practice with CloudWatch Logs, this is easy. This is why I love working with AWS.

Launch an EC2 instance and send its logs to CloudWatch:

  1. Launch an instance with an appropriate IAM role
  2. Install the agent
  3. Configure the agent (see the sketch below):
    1. Region
    2. Log file
    3. Log group name
    4. Stream name
  4. Watch the logs
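
For reference, here's a minimal sketch of what the agent configuration might look like, assuming the classic awslogs agent; the log group name is a placeholder, and the region is set separately in the agent's awscli.conf file.

```
[general]
state_file = /var/lib/awslogs/agent-state

[/var/log/messages]
file = /var/log/messages
log_group_name = demo-syslog
log_stream_name = {instance_id}
datetime_format = %b %d %H:%M:%S
```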

 

I'll put a link to the CloudWatch Logs user guide below, where you can get more information.

Closing

That's it for today. Please leave a comment on the post if you enjoyed it; I love to hear your thoughts. And remember:

“If you’re not willing to learn no one can help you, if you’re determined to learn no one can stop you.”

 

Bye bye for now, and till next time.

Links

http://blog.takipi.com/splunk-vs-elk-the-log-management-tools-decision-making-guide/

http://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/cwl-ug.pdf

 

Learning Cloud 013: How to analyze your AWS monthly bill.

Welcome to the 13th episode of the Learning Cloud podcast. The place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host.

In today's show I want to talk about how to analyze your AWS monthly bill.

  • AWS is an amazing service. All the things you can do:
    • Start a compute instance
    • Deploy resources on different continents
    • Deploy your own WordPress website

But this is not free. Your AWS account is linked to your credit card.

So you need to keep watch of what resources you consume, especially if yours is a personal account for learning purposes.

In AWS you're paying for infrastructure resources as a utility, so you have to watch how much you're consuming.

 

  • Console -> Account Name -> My Billing Dashboard
    • Here you can see your monthly balance.
    • It displays a breakdown per resource.
    • It doesn't show you the breakdown per day.

 

  • My Billing Dashboard -> Cost Explorer
    • Select "Monthly costs by service".
    • Select "Day" to view the daily balance per service.
    • Specify the time range: last 7 days.
    • This gives you a very detailed view (see the sketch below). Even if you're only paying 25 cents a day for a resource, it adds up.
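
If you prefer to script this, the same daily, per-service view is available programmatically. Here's a minimal boto3 sketch, assuming the Cost Explorer API is enabled on your account; the date range is a placeholder.

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2017-11-01", "End": "2017-11-08"},  # last 7 days
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Print the cost of each service for each day in the range.
for day in response["ResultsByTime"]:
    print(day["TimePeriod"]["Start"])
    for group in day["Groups"]:
        service = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        print(f"  {service}: ${amount:.2f}")
```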

 

In my case, my EC2 charge was high even though my instance was stopped. But why, if my instance was stopped? Ahh, I found a running instance in another AZ.

I also found an RDS snapshot.

 

Learning Cloud 012: Identifying all your AWS resources

Welcome to the 12th episode of the Learning Cloud podcast. The place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host.

In today’s show I want to talk about how to identify all the resources you have configured in your AWS account.

 

  • Why is this important?
    • It helps in tracking your costs.
    • AWS bills you for resources as a utility. Instead of paying for one big annual hosting contract, you pay piecemeal for individual resources.
    • You pay for the size of each resource and for the number of resources.
  • Why is it challenging to do? Because the console only shows you the resources within one region at a time.
  • We'll do this using the Tag Editor function.
  • What are tags? Tags are metadata on a resource. They enable you to categorize your AWS resources in different ways, for example by purpose, owner, or environment. This is useful when you have many resources of the same type: you can quickly identify a specific resource based on the tags you've assigned to it. Each tag consists of a key and an optional value, both of which you define.
  • Process (see the sketch below):
    • Open Tag Editor: Console -> Resource Groups -> Tag Editor
    • Filter regions: select all
    • Filter resource types: select all
    • Find resources
    • Sort by resource type
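
As a scripted companion to those console steps, here's a minimal boto3 sketch that lists resource ARNs through the Resource Groups Tagging API. Like the console, the API is regional, so the sketch loops over a placeholder list of regions.

```python
import boto3

REGIONS = ["us-east-1", "us-west-2"]  # hypothetical list of regions you use

for region in REGIONS:
    client = boto3.client("resourcegroupstaggingapi", region_name=region)
    # Paginate in case the account has more resources than one page returns.
    paginator = client.get_paginator("get_resources")
    for page in paginator.paginate():
        for resource in page["ResourceTagMappingList"]:
            print(region, resource["ResourceARN"])
```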

 

Remember to sign up for the live stream of AWS re:Invent. The keynote is on Tuesday, Nov 29.

That's it for today. As always, remember: "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you."

 

-Raf

Learning Cloud 011: Installing the AWS CLI on macOS

Welcome to the 11th episode of the Learning Cloud podcast. The place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host.

 

In today's show I want to talk about installing the AWS CLI on macOS. This is a simple task, but it touches on a number of configurations that give you a good starting point for using AWS.

First I'd like to remind you that AWS re:Invent 2016 runs Sunday, Nov 27 through Friday, Dec 2. The event is sold out.

Sign up for the live stream and you'll be able to watch the keynotes from Andy Jassy and Werner Vogels.

 

The prerequisites for the task we're discussing are:

  • An AWS Account
  • A Mac system

 

There are three ways that you can access AWS:

  • The Console
  • Via SSH, using key pairs
  • Programmatically, using access keys

 

Reason for doing this:

  • You can do a lot of stuff in the GUI, but to use AWS professionally you'll want various tools that automate the tasks you perform via the Console.
  • Some of these tools are CloudFormation, OpsWorks, Terraform, and Packer.

 

In my case I was working with Packer to create a customized AMI. The idea is that I would set up Packer on my system at home, define a configuration, and run it. Packer would then log in to AWS and run all the commands to start a new instance, do some work on it, and create a new AMI based on that image.

Today we’re just focusing on setting up the access to AWS using the CLI, which Packer will use later.
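
In broad strokes, the setup looks like the sketch below. It assumes you already have Python and pip on the Mac and that you've created an access key under IAM; the key values shown are the placeholder examples from the AWS documentation.

```
$ pip install awscli --upgrade --user

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json

$ aws s3 ls    # a quick test that the keys work
```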

I have written a detailed article showing how to do this. You can find the link in the show notes.

 

Remember to sign up for the re:Invent live stream.

 

Thanks for listening, and remember: "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you."

 

Till next time.

 

-Raf

Learning Cloud 010: How to do DevOps with Richard Whitney

Welcome to the 10th episode of the Learning Cloud podcast. The place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host.

Today we're back with an exciting interview. Our guest is a Sr. DevOps engineer.

 

Some of the things we discussed are:

  • What’s the difference between a DevOps engineer and a Developer
  • His Journey from a junior developer to a Sr. DevOps Engineer
  • His side projects and what technologies he is playing with

 

I'm sure you'll enjoy this conversation, so without any further ado, here's the interview with Richard Whitney.

 


I hope you have enjoyed this interview with Richard Whitney. I know I learned a lot, and I look forward to having him back on the show in the future.

 

That is it for today, and remember: "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you."

Links:

https://github.com/rb1whitney/rbwexample/blob/master/Vagrantfile

http://docs.the-startup-stack.com/databases/introduction/

Learning Cloud 009: Installing Chef on AWS

Welcome to the 9th episode of the Learning Cloud podcast. The place for all things Cloud and DevOps.  My name is Raf Calderon and I’ll be your host.

Today I want to spend some time talking about installing Chef in the cloud. This text is only the outline; you need to listen to the podcast for the full content.

 

  1. Built a Chef server on AWS last night
    1. What is Chef? It's an automation tool, an orchestrator. It lets you define a configuration for a server and then keeps the server in that state.
    2. We use Chef at work to automate our cloud builds.
    3. Installing my own Chef server is something I should've done a while back.
    4. With any server software solution, learning starts by installing it yourself. Watching a video doesn't cut it.
    5. I'm a visual learner, so I used a Pluralsight video as reinforcement.
  2. I had the option to spin up an instance fully configured with Chef pre-installed, but there's not as much learning in that.
  3. Built the environment
    1. Requirements: a Chef server, a Chef workstation, and a Chef client
    2. This is where AWS really shined. In the past I wouldn't get past this stage because of the prep time required.
    3. I had the option to select different OSes. It was so easy to do in AWS.
    4. I felt so empowered having knowledge of AWS:
      1. I had the account.
      2. I knew how to work the environment.
      3. I had my own VPC. My instances would get public IPs. I knew how to make sure the port access was there, and I knew I'd be able to connect to my instances with SSH keys. I knew this was a test environment that I could destroy at any second. It's all external to my system, not VirtualBox. Better yet, this would all be free.
  4. Hit an error regarding memory, and put elasticity into practice (see the sketch after this list).
    1. I needed to add more memory to my instance.
    2. Kicked off the installer and saw an error. This put me in troubleshooting mode.
    3. I looked at the requirements, and they wanted 8 GB.
    4. I stopped the instance and resized it from t2.micro to t2.medium.
  5. The instance was running overnight and I got no charges.
    1. I used a t2.medium; the free tier is t2.micro.
    2. It's amazing that I was able to do all that for free.
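
For the resize in step 4, here's a rough sketch of the equivalent AWS CLI commands, assuming a configured CLI; the instance ID is a placeholder.

```
$ aws ec2 stop-instances --instance-ids i-0123456789abcdef0
$ aws ec2 wait instance-stopped --instance-ids i-0123456789abcdef0
$ aws ec2 modify-instance-attribute --instance-id i-0123456789abcdef0 \
      --instance-type Value=t2.medium
$ aws ec2 start-instances --instance-ids i-0123456789abcdef0
```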

 

You can see the step-by-step instructions here.

I hope you have enjoyed today's episode. Please give us a rating in iTunes and leave a comment on our website, learningcloud.tech.

Remember: "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you."

Bye bye for now and until next time.

-Raf

Learning Cloud 008: Learn DevOps with Eric Vanderpoel

Welcome to the 8th episode of the Learning Cloud podcast. The place for all things Cloud and DevOps.  My name is Raf Calderon and I’ll be your host.

 

We have a great show for you today.

Our second guest is joining us. It's funny that I say "second guest"; that's because the show is young, and I'm sure that eventually I won't be numbering our guests.

As a brief introduction, let me just say that our guest is a personal friend of mine, and you'll hear a bit more about how we know each other. I'll leave that as a surprise for now.

He is a seasoned veteran in IT, an expert in DevOps and cloud computing, and an all-around nice guy. So I'm pleased to introduce you to Eric VanderPoel from Barclaycard US.

 


 

I hope you have enjoyed this interview with Eric VanderPoel. As a disclaimer, let me add that while Barclaycard is a great company to do business with and to work for, the views expressed in this show are my own and my guest's. This show doesn't have any legal association with Barclaycard.

Lastly, if you have enjoyed the show, please give us a rating in iTunes and leave a comment on our website, learningcloud.tech.

Remember: "If you're not willing to learn no one can help you, but if you're determined to learn no one can stop you."

 

Bye bye for now and until next time.

 

-Raf

 

Learning Cloud 006: How Spreaker is running their business on AWS with Jacopo Nardiello

Welcome to the sixth episode of the Learning Cloud podcast. The place for all things Cloud and DevOps. My name is Raf Calderon and I'll be your host.

Today we're gonna change things up a little bit. Instead of discussing what I've been up to, I'm very excited to share the first interview for the show.


Our guest is a DevOps engineer at Spreaker which is a platform where you can listen to and create Podcasts.

Today you'll hear how Spreaker is running their infrastructure in the cloud and how they're beginning to implement Docker.

So without further ado, here's the interview with Jacopo Nardiello from Spreaker.

 

Closing

I hope you have enjoyed today's episode. Please leave a comment on the post; I love to hear your feedback. If you enjoyed the show, please give us a rating in iTunes, because this helps the podcast be found by other people. A five-star rating would be greatly appreciated.

That’s it for today and remember

“If you’re not willing to learn no one can help you, if you’re determined to learn no one can stop you.”

Bye bye for now, and till next time.

-Raf