AWS, Cloud, and AI - Pioneering Tomorrow's Landscape


  • Advanced Direnv for DevOps

    Now that you are using Direnv to manage your workspaces, you may stumble upon a difficulty, perhaps motivated (as in my case) by Terraform workspaces. You have a single codebase for a project, but that codebase needs to be deployed to different environments. We can no longer keep a single, fixed .envrc file in the project's folder, because the environment should be different depending on where we want to deploy. So, how do we solve this? Direnv to the rescue, again. Using .env files: Direnv has a directive called dotenv, which can be used to load a set of variables dynamically. So in our .envrc file, we can write: dotenv .env.${ENVIRONMENT} Then we create two files, .env.dev and .env.prod. Each time we enter the directory, the filename of the environment file to load is built from the variable $ENVIRONMENT (dev or prod in our case), as sketched below. Carlos Barroso Senior MLOps Engineer teracloud.io
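    A minimal sketch of this layout, assuming only the dev/prod split described above (the variable names and values inside the .env files are illustrative, not from the post):

        # .envrc - direnv builds the file name from $ENVIRONMENT and loads it
        dotenv .env.${ENVIRONMENT}

        # .env.dev - illustrative values for the dev environment
        TF_WORKSPACE=dev
        AWS_REGION=eu-central-1

        # .env.prod - illustrative values for the prod environment
        TF_WORKSPACE=prod
        AWS_REGION=eu-west-1

        # Usage: set ENVIRONMENT in your shell before entering the directory
        export ENVIRONMENT=dev
        cd path/to/project   # direnv now loads .env.dev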

  • Direnv for DevOps

    For DevOps engineers, working on many different projects and different clouds is a fact of life. We have developed many tools to help us manage and switch between the different contexts and projects, each one with its ups and downs. Today we explore the one that, IMHO, is the best of them all: direnv. How it works Direnv has a very simple premise: each time you cd into a directory with a .envrc file, a new environment is set up for that particular directory using the variables from that file. When you leave the directory, the previous environment is re-established. Technically, direnv evaluates the .envrc in a bash subshell and then exports the resulting variables into your current shell session (and unloads them when you leave), but we won't dig into details for now. For example, if all the resources of a project live in the eu-central-1 region, I have this file at the root of the project: export AWS_REGION=eu-central-1 So each time I cd into this directory, my aws-cli will operate on the eu-central-1 region by default. How I use it In a fast-paced work environment, it takes extra work and care to use your admin superpowers securely, so my goal is to minimize the work while keeping security at its highest. For this, I keep a folder structure like this:
    Workspaces/
      ClientA/
        .envrc
        Project1/
          .envrc
        Project2/
          .envrc
      ClientB/
      ClientC/
    At the root of each client, I define the global settings, like the SSH keys I'm using for that client or the git identity. Then, in each project folder, I define the environment specific to that project, which often includes the project name or the location of kubeconfig files for Kubernetes. A sketch of what these files might contain follows below. Carlos Barroso Senior MLOps Engineer teracloud.io
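    A rough sketch of what those nested .envrc files might contain (all values are placeholders; the use of direnv's source_up in the project file to also load the client-level file is my assumption, not necessarily the author's exact setup):

        # Workspaces/ClientA/.envrc - client-wide settings (illustrative)
        export GIT_AUTHOR_EMAIL="me@clienta.example"
        export GIT_SSH_COMMAND="ssh -i ~/.ssh/clienta_id_ed25519"

        # Workspaces/ClientA/Project1/.envrc - project-specific settings
        source_up                             # also load the client-level .envrc above
        export AWS_REGION=eu-central-1
        export KUBECONFIG="$PWD/.kube/config"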

  • Digital Transformation and Resilience

    The Covid-19 crisis put our society and our digital resilience to the test, that is, the ability to overcome this critical moment and adapt to an unusual and unexpected situation using digital media. Technology has been a key factor in this time of crisis. Thanks to it, remote work, virtual education, telemedicine, and telecare have been enabled. Artificial intelligence has even been used to identify those infected with COVID-19. Human beings are complex entities that are part of organizations facing digital transformation processes. Two very important questions that organizations must ask themselves are: How are people going to adapt to the changes brought about by a digital transformation? And even more, how will they adopt new technologies that sometimes completely change how processes have been carried out within organizations? To answer the first one, we focus on understanding that the technologies of the 4th industrial revolution, such as AI, VR, Blockchain, IoT, and even the cloud itself, which we have coexisted with for so long, are complex for most baby boomers to understand, while methodologies, business, and stability are concepts that, for a large percentage of Millennials, do not make much sense. The great challenge for organizations is to unite, in digital transformation projects, the strengths of collaborators from two worlds so far apart: on the one hand, the EXPERIENCE of the baby boomers and, on the other, the DIGITAL ADOPTION of the Millennials. This is a challenge for the Human Talent areas: making sure both generations participate, recognize the best in each other, and co-create based on their strengths. The success of these strategies depends on people. In this sense, one of the key competencies to face these processes and motivate others is RESILIENCE. From the point of view of psychology, and according to the definition of the Royal Spanish Academy of Language (RAE), it is "the human capacity to flexibly assume extreme situations and overcome them"; in the business field, it is also about being innovative enough to generate ideas and implement plans that lead to emerging stronger from these adverse situations. For this, companies must identify the people who are more RESILIENT within their team so that they can lead it, and hire people with this key competence of the digital age. In this way, it will be easier to answer the second question: How are they going to adopt new technologies that sometimes completely change how processes have been carried out within organizations? A company needs to build resilience into all aspects of the organization, from its go-to-market approach to its operations and its most critical infrastructure. If your organization is digitally resilient, it can anticipate, respond, learn, and evolve in the face of events beyond its control and influenced by technological development. It is even capable of finding business and growth opportunities where traditional companies would only find failure. That is why at Teracloud, as an AWS Select Consulting Partner and certified cloud experts in migrating and deploying startups, enterprises, and everything in between to the cloud, we care about taking your business to the next level, taking into account the needs of each one, transforming and improving to offer better results.
    We consider ourselves a resilient company because we are clear about the values of our organization and we practice them as a team; in addition, we have a realistic perspective, which helps us advance daily with achievable goals and keeps us busy and motivated. We are a group that is constantly transforming and growing, hand in hand with flexibility and success.

  • Using GCloud service accounts in Terraform

    Now that you are comfortably using service accounts to interact securely with GCP (are you still not using them? Refer to the TeraTip Secure your access to GCloud cli with Service Accounts and start doing so), you will want to use them with Terraform too. Terraform requires setting a token, which gives it access to the GCP API using a different identity. This token can be obtained with the gcloud cli and then exported into a variable. Once you do this, Terraform will pick it up automatically and use it for every operation, even for reading the state. These tokens are short-lived (1 hour by default), which decreases our attack surface. In summary, you need to invoke Terraform like this for it to work every time: GOOGLE_OAUTH_ACCESS_TOKEN=`gcloud --impersonate-service-account=${SERVICE_ACCOUNT} auth print-access-token` terraform Alternatively, you can export the variable once and then use the normal terraform commands for one hour. After that time you need to request a new token and export it again: export GOOGLE_OAUTH_ACCESS_TOKEN=`gcloud --impersonate-service-account=${SERVICE_ACCOUNT} auth print-access-token` If you are interested in learning more about our TeraTips or our blog's content, we invite you to see all the content entries that we have created for you and your needs. Carlos Barroso Senior MLOps Engineer teracloud.io
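    Both options as a compact sketch (the service account address is a placeholder, and terraform plan/apply are just example commands, not part of the original tip):

        # Placeholder service account; replace with your own
        SERVICE_ACCOUNT="terraform@my-project.iam.gserviceaccount.com"

        # Option 1: mint a token for a single invocation
        GOOGLE_OAUTH_ACCESS_TOKEN=$(gcloud --impersonate-service-account="${SERVICE_ACCOUNT}" \
            auth print-access-token) terraform plan

        # Option 2: export once and reuse for about an hour, then re-export
        export GOOGLE_OAUTH_ACCESS_TOKEN=$(gcloud --impersonate-service-account="${SERVICE_ACCOUNT}" \
            auth print-access-token)
        terraform plan
        terraform apply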

  • Secure your access to GCloud cli with Service Accounts

    Do you want a time-limited way to give a third party access to your GCP account with a low administrative burden? Look no further: set up a service account! How to do it It's actually very simple: create a new service account and give it the permissions needed by the third party; ask the third party for a Google identity; add this identity to the service account with the Token Creator role; profit! Now the third party needs to execute the gcloud command with an additional parameter, --impersonate-service-account=<service-account-email>. All API calls will be done with this service account's identity. PRO TIP: If you set the variable CLOUDSDK_AUTH_IMPERSONATE_SERVICE_ACCOUNT, you don't need to add the aforementioned parameter, as gcloud will honor it automatically. Carlos Barroso Head of AI teracloud.io
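    A sketch of those steps with the gcloud CLI (the project id, account names, and the storage viewer role are illustrative placeholders, not from the post):

        # 1. Create the service account and grant it what the third party needs
        gcloud iam service-accounts create third-party-sa --project=my-project
        gcloud projects add-iam-policy-binding my-project \
            --member="serviceAccount:third-party-sa@my-project.iam.gserviceaccount.com" \
            --role="roles/storage.objectViewer"

        # 2-3. Allow the third party's Google identity to mint tokens for it
        gcloud iam service-accounts add-iam-policy-binding \
            third-party-sa@my-project.iam.gserviceaccount.com \
            --member="user:partner@example.com" \
            --role="roles/iam.serviceAccountTokenCreator"

        # 4. The third party now impersonates it per call...
        gcloud storage ls --impersonate-service-account=third-party-sa@my-project.iam.gserviceaccount.com
        # ...or once per session via the environment variable
        export CLOUDSDK_AUTH_IMPERSONATE_SERVICE_ACCOUNT=third-party-sa@my-project.iam.gserviceaccount.com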

  • How to keep your AWS Keys Safe on your terminal

    One of the causes of AWS key leaks is configuring your AWS CLI with the command aws configure; this can lead you to store your credentials in plain text under your home folder. If you are using Linux or macOS, you may know about the Keychain and the D-Bus Secret Service, so let's combine both things into a secure and robust solution to keep your keys safe. We will use https://github.com/sorah/envchain as the primary tool to automate the keychain unlock, export the values into a subshell, and allow you to use them securely until your process finishes. Clone and install the envchain tool from the sources on GitHub. Create a namespace to store your secure env vars (you can have multiple, one per client, per environment, per project, etc.): envchain --set production AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_REGION Replace production with the name of your environment / AWS account. Start using it by adding the prefix: envchain production aws sts get-caller-identity Optionally, you can get into a subshell by doing this: envchain production bash Remember to close the session to remove your secrets from the env vars. More information: https://github.com/sorah/envchain https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html https://rtfm.co.ua/en/what-is-linux-keyring-gnome-keyring-secret-service-and-d-bus/ https://es.wikipedia.org/wiki/GNOME_Keyring https://support.apple.com/guide/mac-help/use-keychains-to-store-passwords-mchlf375f392/mac Don't stop here! You may be interested in reading How to Deploy IAM Conditional Policies with Terraform Damian Gitto Olguin AWS Hero Teracloud.io
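    Since the post mentions keeping one namespace per client or environment, this is how that might look in practice (namespace names and the example commands are illustrative):

        # Keys never touch ~/.aws/credentials; they live in the OS keychain.
        # One namespace per account/client:
        envchain --set clienta-prod AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_REGION
        envchain --set clientb-dev  AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_REGION

        # Run commands against each account without exporting anything globally
        envchain clienta-prod aws sts get-caller-identity
        envchain clientb-dev  aws s3 ls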

  • Digital Nomads, Location Independence

    We were forced to work from home by the pandemic. For some time now, our living room has become an office, the dining room table a desk, and surely some room a virtual meeting room. This new normal is a way of understanding the world and the relationship between the personal and professional spheres. Remote work has become something that many of us love. It is the time of digital nomads, even more so now that technological advances allow us to broaden our horizons and work from anywhere in the world; many companies give their employees the freedom to move around without a fixed place and the chance to see incredible places; the only requirement is to be well equipped. Take into account the following recommendations and tools to be able to do a good job and avoid headaches: a laptop, an external hard drive and storage, a power bank, online platforms, and a power adapter. Today, with how advanced the Internet is and how globalized everything is, almost any skill we have can be exercised remotely from anywhere in the world. Are you good at drawing? Are you good at programming? Today almost everyone needs a website, and those who don't need one built need maintenance or, why not, a redesign. Currently, many digital jobs are highly valued and in great demand, especially those that have to do with technology, such as: Software developer: a web developer has a variety of possibilities to work online and be a digital nomad, whether as a front-end, back-end, or full-stack developer. Cloud architect: cloud architects manage a company's infrastructure and services in the cloud, creating platforms and storage solutions over the Internet. Business intelligence analyst: business intelligence transforms data into information to optimize company profitability and analyze market trends. Cybersecurity specialist: these professionals are in charge of data, network, information, and cloud security to protect companies and users in their daily operations. Machine learning specialist: machine learning engineers work in this branch of Artificial Intelligence (AI) so that computers can identify complex patterns among millions of data points through algorithms. Blockchain developer: Blockchain experts allow transactions to be carried out reliably and securely, without the need for an intermediary, and represent an opportunity for companies to seek new business opportunities. In short, thanks to the Internet and the pandemic, a new conception of more independent and free work has emerged, where the essential thing is to work toward objectives rather than counting the hours spent sitting in an office, with the option to do so from anywhere and feel better. More than a working modality, it is a way of life that brings interesting benefits and is an increasingly common trend in different countries thanks to the possibilities offered by digital tools. Our DevOps team is made up of people of different nationalities located in different parts of the world. Through good communication, clear objectives, and a shared vision, we function very well and grow more every day without setting location barriers, always immersed in the wonderful adventure of the cloud world. Don't stop here! You may be interested in reading Our Recruiting process: We Hire Character, We Train Skills

  • Build Docker containers on Kubernetes with Jenkins and Kaniko

    This writeup documents the current best way to build Docker containers within transient Jenkins agents inside a Kubernetes cluster. This setup has unique features and unique caveats you need to consider, and it intends to save you, dear reader, the time I invested trying different solutions. Setup These are the components used in this setup. You can replace the K8S provider and the solution and considerations will still hold. You can also change the destination registry for your images, though this will require a change in the authentication method: a GKE cluster; GCR for storing images and cache layers; the official Kaniko docker image to build Dockerfiles; a Kubernetes and a GCP Service Account to provide credentials to the workers following the Principle of Least Privilege; and the official Jenkins docker images for the Master and the Worker. Why not docker build? The docker command requires a working docker daemon, which requires setting up several components, customizing the Jenkins docker images, and more work. Using Kaniko allows us to use the official images and to avoid a lot of work. The resulting images are very similar to the ones built by docker and totally compatible. In a nutshell The steps for this setup are: Create the Kubernetes cluster. Create a ServiceAccount on GCP with StorageAdmin privileges to be able to read and push images to the registry (this may not be needed depending on your setup). Create a ServiceAccount in Kubernetes. Join both ServiceAccounts. Use helm to install the Jenkins Master (do not use the Jenkins Controller, as it is broken at the time of this writing): helm install jenkins-ci jenkinsci/jenkins Add this code at the top of your Jenkinsfile:
    pipeline {
      agent {
        kubernetes {
          //cloud 'kubernetes'
          defaultContainer 'kaniko'
          yaml """
    kind: Pod
    spec:
      serviceAccountName: jenkins-sa
      containers:
      - name: kaniko
        image: gcr.io/kaniko-project/executor:debug-539ddefcae3fd6b411a95982a830d987f4214251
        imagePullPolicy: Always
        command:
        - sleep
        args:
        - 9999999
    """
        }
      }
    Add this line to your Jenkinsfile to build and upload your image: sh '/kaniko/executor -f `pwd`/Dockerfile -c `pwd` --cache=true --destination=:$CI_COMMIT_TAG' After executing this job, your container will be built and uploaded to your GCP registry. For ECR or other registries, you need to set up a different authentication mechanism. Would you like to receive our newsletter with more TeraTips? Leave us your comments. Carlos Barroso Senior MLOps Engineer Teracloud
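    For reference, here is the same Kaniko invocation written out as a standalone shell command, with the registry part of --destination made explicit (gcr.io/my-project/my-app is a placeholder; the post's own snippet leaves the registry and image name out):

        /kaniko/executor \
          -f `pwd`/Dockerfile \
          -c `pwd` \
          --cache=true \
          --destination=gcr.io/my-project/my-app:$CI_COMMIT_TAG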

  • Our Recruiting process: We Hire Character, We Train Skills

    The famous phrase "Join our Team!" does not represent our recruiting process. We want you to join our Teracloud family, but we also want to join your professional life! In one way or another, we are changing and innovating in all our processes related to Talent. We are taking a turn in everything related to Talent management, and the recruitment and selection process is not far behind. We break with the traditional interview scheme; rather, we consider these exchange calls. We do not believe in the positions of "interviewee" and "interviewer". Our main challenge is to transmit our culture to you as much as possible. It is about giving you an overview, so that you know where we stand, what we do, and especially how we are; that you know our culture and our way of working, and that you get a feel for what a day at Teracloud is like. Also, and why not, we consider these calls part of a learning process where we as a company, and I as a recruiter, learn from the candidates and from this sector that grows stronger and faster every day. Can I tell you what the whole process is like? We start with a first contact (which can go from us to you or from you to us) where we coordinate a video call, which we generally do through Meet. In the call, I will tell you about Teracloud, its history (how it became an AWS Select Consulting Partner), its Team, its ways of working, but above all, what I think matters most: our culture. I tell you that we are a great team and I make the comparison (so that you can get a closer idea) with a family eating a Sunday barbecue. That is what we are, incredible as it may seem, every day. Because of COVID and everything that followed, we continue working remotely and evaluating new possibilities to meet again. But even working virtually, we stay close. Very close, in the daily "all hands", where we use the first 10 minutes for small talk, to replace those minutes it used to take us to prepare coffee or fill a bottle of water in the office. Also, I tell you that once a month we attend the "Own It", a meeting of all Teracloud teams, DevOps and Social, where we chat about each other's projects and work. We have spaces designed to share ideas and make contributions, and we take the opportunity to celebrate people! After all our presentations, I ask you to tell me what you are up to: what do you like to do, what do you enjoy, what are your professional expectations? And I listen to everything you have to tell me. If we continue to move forward, it takes no more than a week for us to coordinate our next call, together with our CTO or a member of our Technical Team. There we talk about the day-to-day of our DevOps team, about our projects, and about EVERYTHING we are so passionate about: the infinite world of the Cloud. We focus our process on getting to know as many candidates as we can. Regardless of "seniority", as many ask me, what I always say is that we are not focused on the famous and recognized seniority; we bet on the learning capacity each candidate possesses: their capacity to grow, learn, and develop, and above all the desire, motivation, and passion with which they work every day. We believe that each call we have is a new door that opens for us to meet and an opportunity to work together. In this process, we make decisions together, and we also want you to choose us so we can grow and learn together! I invite you to follow us on our Instagram and our Talents profile on LinkedIn.
    There you will be able to learn even more about our culture! And if you are interested in our opportunities, send us an email at talents@teracloud.io. We always have new challenges! Florencia Sánchez Talent Manager Teracloud If you want to know more about our services, tips, blogs, or a free assessment, email our team member ben@teracloud.io #Teracloud #recruitingprocess #aws #awslatam #SocialteamTeracloud #DevOps #cloudcomputing #keepgrowing #hirecharactertrainskills #Terablog

  • My personal journey into the Cloud!

    The words "Cloud Computing" are used far too often these days, and you may not be familiar with this technical term and the nuances behind it, as was my case a year and a half ago. Over the past 18 months, I have been working for an IT company, Teracloud, and now I can safely tell you the meaning of Infrastructure as Code, Machine Learning, data migration, the HIPAA Compliance Law and Privacy Rules, and many other concepts related to the Cloud Computing environment. I'm Victoria and I specialize in Digital Marketing. The other jobs I have held were totally unrelated to the Cloud. Of course, I knew who Jeff Bezos was when I arrived at Teracloud, but I had no idea that Amazon developed cloud services. I was only aware, and very well, that Amazon was an e-commerce platform (my apologies to Jeff, DevOps, DevSecOps, and SysAdmins!!!). This job gave me the possibility of getting into the Cloud Computing world little by little. One month after I began working, my teammates and I traveled to the AWS Community Day, which took place in Buenos Aires for the first time, in order to cover the event on social media. "AWS Community Day events are community-led conferences where event logistics and content is planned, sourced, and delivered by community leaders. They feature technical discussions, workshops, and hands-on labs led by expert AWS users and industry leaders from around the world. Community Day events deliver a peer-to-peer learning experience, providing developers with a venue for them to acquire AWS knowledge in their preferred way: from one another. In many ways, they are events put on 'by the community, for the community'"1. We attended the keynote speech of the Vice President & Chief Technology Officer at Amazon, Werner Vogels, and met important personalities of AWS Latam, such as the AWS Country Manager for Argentina at that moment, Andrea Cerqueira, AWS Heroes, and many others. During my time at Teracloud, I have learned a lot about this community. I've met a lot of interesting people at several events, like workshops and #AdminBirras, organized by the DoctArmy group. The DoctArmy group organizes tech meetups in Córdoba, Argentina, in order to gather professionals in the field to share their ideas and experiences; every first Friday of the month, DoctArmy brings together members of the IT community in an informal space, #AdminBirras, where IT people get to know each other over beers. My work team and my bosses encouraged me to continue this learning journey into Cloud Computing and Amazon Web Services. Teracloud, as an AWS Select Consulting Partner, gave me the opportunity to improve my knowledge by attending AWS courses. That could sound simple enough, but it was actually quite challenging. I made up my mind and last year, despite the nervousness and inner tension I felt during the process, I achieved the "AWS Business Professional" and "AWS Solutions Training for Partners: Foundations - Business" accreditations. Yeah!!! To complete these digital courses I had to attend several video lessons and take exams; both of them helped me understand the fundamentals of AWS, showing me their tools and teaching me how the business is designed. The "AWS Business Professional" accreditation course provided me with a basic understanding of key Amazon Web Services products and services, the AWS Partner Network (APN), and core AWS business values for partners.
    Besides, the "AWS Solutions Training for Partners: Foundations - Business" accreditation course allowed me to learn about techniques and best practices for AWS solutions that solve customer challenges. Every day, I learn and get surprised by new things and concepts of this fascinating world of delivery of computing services. Cloud Computing covers the fields of servers, storage, databases, networking, software, analytics, and Artificial Intelligence over the Internet (that's the Cloud!). With the services it offers, a business is able to scale and run its infrastructure more efficiently. This year, my main challenge will be to earn my "AWS Cloud Practitioner" certification. So… any help people can or want to offer will be gladly accepted. I keep going on my journey into the Cloud! If this blog has piqued your interest, I suggest you get started on your own journey to the Cloud! You may be interested in reading INTERNAL WORKSHOP: KUBERNETES References 1 What are AWS Community Days? Retrieved April 7, 2021, from https://aws.amazon.com/es/events/community-day/ Victoria Vélez Funes Social Media Manager Teracloud

  • Using SSM Parameter Store

    Some configurations can be considered private and high risk. Data such as database passwords and other valuable information can be safely stored in the SSM Parameter Store service. The service offers the possibility of storing the data we consider "secret", to be consumed later by our applications. Some of its features are: serverless and scalable; version tracking of configurations and secrets; encryption with KMS (optional); notifications with CloudWatch Events; configuration management using IAM and paths. There are 3 types of parameters: String, StringList, and SecureString. The parameters can be saved in the form of a hierarchy, for example: /department/frontend/dev/Url-app, /department/frontend/dev/Db-password, and /department/frontend/prod/Db-password. The service has 2 tiers: Standard (free) and Advanced (paid). These are some of the characteristics of each tier. Standard: can store up to 10,000 parameters; the maximum size of a parameter value is 4 KB (a really long value); storage is free. Advanced: can store up to 100,000 parameters; the maximum size of a parameter value is 8 KB; $0.05 per parameter per month. Here is an example of how to create our parameters: 1. First, log in to the AWS console and, in the search bar, filter for example by the word "parameter", then click on Systems Manager. 2. On the left side, click on "Parameter Store". 3. Then click on "Create parameter". 4. Here we create one parameter of the previous hierarchy, and then click on "Create parameter" at the bottom of the page. Our first parameter has been created. Now we are going to create the second parameter, but this one will be a SecureString. We must also select the KMS key source; in this case, we will be using the KMS key that Amazon provides to us (alias/aws/ssm). Insert the value and then click on Create parameter again. Now I have my two parameters with their values, as shown in the CLI sketch below. What do you think? Leave us your comments. You may also be interested in reading Quick AWS Region change
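    The same parameters can also be created and read from the AWS CLI; a small sketch following the post's example hierarchy (the values themselves are illustrative):

        # Create a plain String parameter and a KMS-encrypted SecureString
        aws ssm put-parameter --name /department/frontend/dev/Url-app \
            --type String --value "https://dev.example.com"
        aws ssm put-parameter --name /department/frontend/dev/Db-password \
            --type SecureString --key-id alias/aws/ssm --value "s3cr3t-example"

        # Read everything under the dev path, decrypting SecureString values
        aws ssm get-parameters-by-path --path /department/frontend/dev --with-decryption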

  • Google Cloud: Importing resources from GCloud to IaC in Terraform

    Managing resources in the cloud can be frustrating if the infrastructure is too big or too complex. That's why we suggest having the resources expressed as Infrastructure as Code (IaC), with Terraform among other tools. But what if our infrastructure was already created manually? How can we get those resources imported into Terraform? Well, our prayers were heard: Google brings us a pretty easy solution with the help of the command-line interface, with just one single command: gcloud alpha resource-config bulk-export --resource-format=terraform You just need 3 things to make this command work: the google-cloud-sdk-config-connector package installed on your system; your Google Cloud credentials configured; and the Cloud Asset API enabled. Then just run: ❯ gcloud alpha resource-config bulk-export --resource-format=terraform >> test.tf And let the magic happen: Exporting resource configurations… And that's all: all your Google Cloud infrastructure will be exported from the cloud as IaC with just one command! Hope this tip helps you! We will continue posting valuable content that will help you with your daily work and tasks. You may be interested in reading https://www.teracloud.io/single-post/2019/10/24/my-experience-with-aws-china Leandro Mansilla DevOps Engineer Teracloud
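    Put together, the prerequisites and the export might look like this (the project id is a placeholder; installing config-connector as a gcloud component is one possible install path and an assumption on my part, the post only names the package):

        # Prerequisites
        gcloud components install config-connector            # bulk-export tooling
        gcloud services enable cloudasset.googleapis.com --project=my-project
        gcloud auth login                                      # credentials

        # Dump the project's existing resources as Terraform configuration
        gcloud alpha resource-config bulk-export \
            --project=my-project \
            --resource-format=terraform >> test.tf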
