Blog | Teracloud


  • How to Deploy IAM conditional policies with Terraform

    Nowadays, AWS is the leading cloud provider worldwide and offers a wide variety of services. One of the most important is IAM (Identity and Access Management), where we can manage access to AWS services and resources in a secure way; best of all, it is a free feature, so there is no additional charge. Access and permissions are managed by creating IAM policies. Once the policies are created, the recommended way to work is to assign them to groups and then add our users to those groups. This keeps our users organized while the policies stay attached to the group they belong to. Another good practice when working with permissions is what is known as “least privilege”: always grant only the MINIMUM permissions our users need to operate correctly, and nothing more. Using Terraform (Infrastructure as Code) we can deploy this too, but sometimes we need to create a policy that has more than one condition, like the JSON below:

    {
      "Effect": "Allow",
      "Action": [
        "ec2:CreateTags"
      ],
      "Resource": "arn:aws:ec2:*:*:security-group/*",
      "Condition": {
        "StringEquals": {
          "ec2:CreateAction": "CreateSecurityGroup"
        },
        "Null": {
          "aws:RequestTag/elbv2.k8s.aws/cluster": "false"
        }
      }
    }

    In Terraform, the solution looks like this:

    statement {
      actions   = ["ec2:CreateTags"]
      resources = ["arn:aws:ec2:*:*:security-group/*"]

      condition {
        test     = "StringEquals"
        variable = "ec2:CreateAction"
        values   = ["CreateSecurityGroup"]
      }

      condition {
        test     = "Null"
        variable = "aws:RequestTag/elbv2.k8s.aws/cluster"
        values   = ["false"]
      }
    }

    What we do here is declare a separate condition block for each conditional, with its own variable and values. This way, we can convert a JSON policy with more than one condition into Terraform.
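    If you want to double-check the behavior before applying it, you can run the rendered JSON through the IAM policy simulator from the CLI. A minimal sketch, assuming you saved the policy document as policy.json (the account ID and security-group ARN below are placeholders):

    # Simulate ec2:CreateTags with both condition keys present in the request context
    aws iam simulate-custom-policy \
      --policy-input-list file://policy.json \
      --action-names ec2:CreateTags \
      --resource-arns "arn:aws:ec2:us-east-1:123456789012:security-group/sg-0123456789abcdef0" \
      --context-entries \
        "ContextKeyName=ec2:CreateAction,ContextKeyValues=CreateSecurityGroup,ContextKeyType=string" \
        "ContextKeyName=aws:RequestTag/elbv2.k8s.aws/cluster,ContextKeyValues=my-cluster,ContextKeyType=string"

    Each action in the output comes back with an evaluation decision (allowed, explicitDeny, or implicitDeny), so you can confirm that both conditions are satisfied at the same time.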

  • Do you want to import an ECDSA certificate into AWS?

    Don't waste your time trying to import it into ACM. Amazon Certificate Manager says ECDSA is supported, but in practice it is almost impossible to import the certificate, so you can simply upload it as an IAM server certificate instead. An example:

    aws iam upload-server-certificate \
      --server-certificate-name ecdsa-certificate-example \
      --certificate-body file://Certificate.pem \
      --certificate-chain file://CertificateChain.pem \
      --private-key file://PrivateKey.pem

    For more information: https://aws.amazon.com/premiumsupport/knowledge-center/import-ssl-certificate-to-iam/ Damian Gitto Olguin CTO Teracloud.io #Teracloud #aws #TeraTips #ECDSA #certificate #awslatam #DevOps Follow us for more TeraTips
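    Once the upload succeeds, you can confirm the certificate is in place and grab its ARN with the IAM CLI. A quick check, assuming the same certificate name used above:

    # List all server certificates stored in IAM
    aws iam list-server-certificates

    # Show the metadata (including the ARN) for the one we just uploaded
    aws iam get-server-certificate --server-certificate-name ecdsa-certificate-example \
      --query 'ServerCertificate.ServerCertificateMetadata'

    The returned ARN is what you reference when attaching the certificate to a load balancer or a CloudFront distribution.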

  • Serverless deployment

    There are many ways to deploy infrastructure as code, but today's Teratip is about a special one we like to use: Serverless. As with many IaC tools, we start by writing a text file and then running a binary that controls the creation of what we declared. But Serverless has a control advantage over infrastructure that requires AWS resources like Lambda functions or DynamoDB. In less than 50 lines of YAML, you can create a state-of-the-art infrastructure using S3 buckets, DynamoDB, and more, with all the required policies to keep it safe. For example, a YAML file like the following will create an S3 bucket, a DynamoDB table, and the infrastructure for a function to communicate with them:

    service: quicksite
    frameworkVersion: ">=1.1.0"
    provider:
      name: aws
      runtime: nodejs10.x
      environment:
        DYNAMODB_TABLE: ${self:service}-${opt:stage, self:provider.stage}-uniqname
      iamRoleStatements:
        - Effect: Allow
          Action:
            - dynamodb:Query
            - dynamodb:Scan
            - dynamodb:GetItem
            - dynamodb:PutItem
            - dynamodb:UpdateItem
            - dynamodb:DeleteItem
          Resource: "arn:aws:dynamodb:${opt:region, self:provider.region}:*:table/${self:provider.environment.DYNAMODB_TABLE}"
    functions:
      create:
        handler: fn/create.create
        events:
          - http:
              path: fn
              method: post
    resources:
      Resources:
        MyBucket:
          Type: AWS::S3::Bucket
          Properties:
            BucketName: ${self:service}-${opt:stage, self:provider.stage}-uniqname
            AccessControl: PublicRead
        MyDb:
          Type: 'AWS::DynamoDB::Table'
          DeletionPolicy: Retain
          Properties:
            AttributeDefinitions:
              - AttributeName: id
                AttributeType: S
            KeySchema:
              - AttributeName: id
                KeyType: HASH
            ProvisionedThroughput:
              ReadCapacityUnits: 1
              WriteCapacityUnits: 1
            TableName: ${self:provider.environment.DYNAMODB_TABLE}

    Once you have your YAML file, Serverless will compile it to CloudFormation, deploying its full content and keeping records for future modifications. Nice, isn't it? Give it a try. Start at https://www.serverless.com/ Let us know if you like Serverless and we'll keep you updated with more Teratips about it. Juan Eduardo Castaño DevOps Engineer Teracloud If you want to know more about our services, tips, blogs, or a free assessment email our team member ben@teracloud.io #Teracloud #aws #serverlessdeployment #TeraTips #awslatam #deployment #lambda #DynamoDB #S3Buckets #infrastructure
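    To turn that file into running infrastructure, all you need is the Serverless Framework CLI. A minimal sketch of the workflow (the stage and region values are just examples):

    # Install the Serverless Framework CLI (requires Node.js)
    npm install -g serverless

    # Deploy the service described in serverless.yml
    serverless deploy --stage dev --region us-east-1

    # Inspect the deployed endpoints and resources
    serverless info --stage dev --region us-east-1

    # Tear everything down when you are done experimenting
    serverless remove --stage dev --region us-east-1

    Under the hood, each deploy produces a CloudFormation stack, so you can also follow the progress from the CloudFormation console.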

  • Did you know there is a better way to connect to your AWS Linux Instances than SSH?

    For years, SSH was the only way to access your EC2 instances, and you have surely exposed port 22 to anywhere (0.0.0.0/0), making your instances or bastion hosts reachable by anyone on the internet. Some people prevent this by implementing VPN solutions, which increases complexity, adds potential points of failure, and brings tons of maintenance tasks. Your search ends right here: there is a great tool that lets you connect to the Linux terminal, SSM Session Manager. This powerful tool adds great features to improve your environment's security:
      • Removes the administrative tasks of managing SSH keys
      • Authentication and authorization rely on your IAM setup
      • You can connect to your instance using the web console or the AWS CLI
      • Removes the need to set up bastion hosts or VPN servers to reach instances in private networks
      • One-click access to instances from the console and CLI
      • Provides logging and auditing of session activity
      • Supports tunneling: you can use a Session-type SSM document to tunnel traffic, such as HTTP or a custom protocol, between a local port on a client machine and a remote port on an instance
    Where can I start? Here is the official documentation: https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager.html The short version is:
      • Create an IAM role with an instance profile
      • Attach the policy named AmazonSSMManagedInstanceCore to the role
      • Attach the IAM instance profile to an EC2 instance as you launch it, or to a previously launched instance
      • Verify that your instance meets the requirements
      • Remove port 22 (or the custom port associated with SSH) from your instance's security group
      • On the EC2 web console, select the instance, open the Actions menu, choose Connect, select the Session Manager tab, and finally hit Connect
    If you want to use your terminal instead, meet the requirements, install the Session Manager plugin, and run this command:

    aws ssm start-session --target <instance-id>

    Now you know how to use this fantastic tool and improve your workload security. Give it a try, you will not regret it. Follow us on @Teracloud.io for more #TeraTips Damian Gitto Olguin CTO Teracloud.io
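    For the terminal route end to end, here is roughly what the setup above looks like with the AWS CLI. This is a sketch with illustrative names: the role, profile, trust-policy file, and instance ID are placeholders, and the trust policy must allow ec2.amazonaws.com to assume the role.

    # Create a role EC2 can assume and attach the SSM managed policy
    aws iam create-role --role-name ssm-ec2-role \
      --assume-role-policy-document file://ec2-trust-policy.json
    aws iam attach-role-policy --role-name ssm-ec2-role \
      --policy-arn arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore

    # Wrap the role in an instance profile and attach it to a running instance
    aws iam create-instance-profile --instance-profile-name ssm-ec2-profile
    aws iam add-role-to-instance-profile --instance-profile-name ssm-ec2-profile --role-name ssm-ec2-role
    aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 \
      --iam-instance-profile Name=ssm-ec2-profile

    # Once the instance shows up as managed in Systems Manager, open a shell
    aws ssm start-session --target i-0123456789abcdef0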

  • What does the Cloud bring to Companies in 2021?

    In business, the pressure on the workforce and its responsibilities keeps increasing. We have been in a pandemic for almost a year, and many companies have been affected by Covid-19; many expect 2021 to be a year of recovery and improvement, a year to adapt to change and take their businesses to another level. This is why, to meet customer demands and work in real time, companies have been adopting cloud computing. Cloud computing offers modern businesses advantages that include allowing multiple users to view data in real time and share projects effortlessly. Instead of owning their own IT infrastructure or data centers, companies can rent access to anything from applications to storage from a cloud service provider. In this blog, we will show you some cloud trends and what the cloud brings to companies in 2021.
      • Increase in demand: The transition to the cloud is accelerating because it allows business models to quickly adapt to new consumption habits, behavior patterns, and new ways of working, relating, and communicating.
      • Everything as a Service (XaaS): Companies will move toward an as-a-service model to offer all their services in the cloud, achieving more scalability, flexibility, and faster delivery times for both customers and the organization.
      • Optimization of cloud operations: Companies that are already in the cloud aim to align spending with business objectives, taking advantage of the agility, services, and innovation that this technology provides.
      • Automation: Automation is becoming the rule, because routine tasks can be eliminated, especially for technology departments.
      • Support for innovation: Organizations that want to drive innovation by launching new digital services will see this technology as the basis to do so, as it allows them to take advantage of the potential of big data and analytics, artificial intelligence, and machine learning solutions.
      • Cybersecurity: Security is essential to consider. The increase in the number and severity of attacks has led companies to focus on this issue, forcing them to pay more attention to it, with a forceful response from the industry through alliances, training initiatives, and closing possible doors to cybercrime. From a technology perspective, companies and their IT partners must avoid configuration errors and unauthorized access, which are, today, the main threats identified by customers.
    As 5G begins to catch on in 2021, security will be even more challenging for cloud-based organizations. 5G is about increasing the speed and connectivity of IoT: cars, voice assistants, wearable devices, and even household appliances have network connections, and thanks to 5G, what these devices can do is now of much greater concern. 2021 promises to be another transformative year for IT, largely because the effects of Covid-19 linger. However, one day the coronavirus will disappear, and the question facing most organizations as they emerge from this era will be how to wisely accept new challenges and new realities. Cloud computing, thanks to its versatility and ubiquity, may be the answer you are looking for. Liliana Medina Community Manager Teracloud If you want to know more about our services, email our team member ben@teracloud.io #Teracloud #aws #AWSLatam #Cloudcomputing #DevOps #Cloud2021 #nextlevel #Tech #machinelearning #artificialintelligence #bigdata #cybersecurity #IoT

  • Quick AWS Region change

    It happens very often that you have to manage an account from the console. But when your infrastructure is multi-region, you need to change the AWS_REGION environment variable constantly, and it is very easy to confuse region names because they don't differ much, do they? Well, here you have a Teratip to make a descriptive region change: copy and paste the following function into your .bashrc file.

    function region(){
      export AWS_REGION=$(dialog --stdout --menu "AWS Region" 15 70 8 \
        us-east-1 Virginia \
        us-east-2 Ohio \
        us-west-1 California \
        us-west-2 Oregon );
    }

    You might need to install dialog to make it work. At the moment it is available for Linux and Mac users only, but perhaps you can find alternatives for Windows as well. The most important thing is the concept, so you can adapt it to your daily routine. Then log in to a new console and type region to choose your region. Feel free to modify it with the regions you actually use in your deployments. Come back later for more Teratips about the Linux/Mac console. Juan Eduardo Castaño DevOps Engineer Teracloud
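    If dialog is not already on your machine, and to confirm that the variable actually takes effect, something like the following should do (the package-manager commands assume Debian/Ubuntu or Homebrew):

    # Install dialog
    sudo apt-get install -y dialog    # Linux (Debian/Ubuntu)
    brew install dialog               # macOS

    # Reload your shell config, pick a region, and verify the CLI uses it
    source ~/.bashrc
    region
    echo "Selected region: $AWS_REGION"
    aws ec2 describe-availability-zones --query 'AvailabilityZones[].ZoneName' --output text

    The last command lists the availability zones of whichever region you picked, a quick sanity check that the CLI is honoring AWS_REGION (AWS CLI v2 reads it natively; older v1 releases may only honor AWS_DEFAULT_REGION).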

  • Pending Pods: Limits in EKS

    It is annoying to finally deploy our application in an EKS cluster only to get the eternal “Pending” state on our pods. Several reasons might leave a pod in a Pending state; most of them are related to compute resource limits, but some are related to IP address limits on our worker nodes. However, reaching this kind of limit is an easy problem to solve. You can find the right instance type for your Kubernetes workloads by checking this handy document provided by AWS: https://github.com/awslabs/amazon-eks-ami/blob/master/files/eni-max-pods.txt It lists the maximum number of pods per instance, and best of all, not just a few but all instance types available on EKS. So, if you want to avoid some frustration and headaches, that document is a good starting point! Come back later for more Teratips about EKS. Leandro Mansilla DevOps Engineer Teracloud
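    To check whether you are actually hitting that limit, a couple of kubectl commands go a long way. A small sketch (the pod and node names are placeholders); the per-instance ceiling comes from the ENI math: max pods = ENIs x (IPv4 addresses per ENI - 1) + 2.

    # Find pods stuck in Pending and read the scheduler's reason in the events
    kubectl get pods --all-namespaces --field-selector=status.phase=Pending
    kubectl describe pod <pending-pod-name> | tail -n 20

    # Compare each node's pod capacity with what it is already running
    kubectl get nodes -o custom-columns=NAME:.metadata.name,MAXPODS:.status.allocatable.pods
    kubectl get pods --all-namespaces -o wide | grep <node-name> | wc -l

    If the scheduler events mention "Too many pods" rather than insufficient CPU or memory, the eni-max-pods.txt table above tells you which instance type gives you more headroom.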

  • What Constitutes a HIPAA Violation?

    Before cloud services can be used by healthcare organizations for storing or processing protected health information (PHI), or for creating web-based applications that collect, store, maintain, or transmit PHI, covered entities must ensure the services are secure. "For large organizations, the most common uses of the cloud are for hosting analytics applications and data (48%), hosting financial applications and data (42%), operational applications and data (42%), and HR applications and data (40%). 38% were using the cloud for disaster recovery and backups." (1) It is very important to avoid some of the most common HIPAA violations:
      • Sending texts containing PHI
      • Improper mailing or emailing of PHI
      • Failure to monitor and maintain PHI access logs
      • Omission of a HIPAA-compliant Business Associate (BA) agreement with vendors before allowing access to the information system containing PHI
      • Accessing patient information on a personal device or home computer
      • Inadequate or missing limitations as to who may view PHI
      • Failure to remove access authorization from employees who no longer have a reason to access PHI
      • Poor training to ensure that employees understand the many HIPAA requirements and guidelines
      • Lack of documentation of HIPAA compliance efforts
    Lost or Stolen Devices: If any device belonging to a person with access to PHI is lost or stolen, it is a direct violation of HIPAA. That is why it is vitally important to keep track of your mobile devices. It's also worth having remote-wipe systems in place in case a device goes missing.
    Employee Disclosure of PHI: Discussing a patient's condition, medications, or any personal data with co-workers or friends is a direct violation of HIPAA regulations.
    Improper Disposal of Medical Records: Electronic information that is deleted must be tracked and logged. When in doubt, employees should seek the advice and training of their IT or compliance team to properly dispose of PHI records.
    Mishandling of Records: Photocopiers are a high-risk zone for mishandling of PHI. Most photocopiers feature a storage drive that saves a document so employees can retrieve it at their desk or re-print it later. If the person creating the document forgets to close their session, the next employee to use the machine could access it.
    Failure to Conduct a Risk Analysis: The HIPAA Security Rule and the HHS mandate that healthcare organizations perform a risk analysis. The risk analysis helps organizations discover vulnerabilities and opportunities for improvement in their computing systems. If the results indicate issues with the confidentiality, integrity, or availability of electronic PHI held by the healthcare organization, the organization can correct them. Left uncorrected, the findings may result in HIPAA violations.
    We hope you keep in mind that the top benefits for healthcare organizations migrating to the cloud are performance and reliability, ease of management, total cost of ownership, and infrastructure agility. Contact us to become HIPAA compliant! Our team of security experts can help you!
    References: 1. https://www.hipaajournal.com/hipaa-compliance-cloud-computing-platforms/ Liliana Medina Community Manager Teracloud If you want to know more about our services, email our team member ben@teracloud.io #Teracloud #aws #HIPAA #HIPAAviolations #performance #costeffective #infrastructureagility #cloudsecurity #patientsafety

  • We decided to see the glass half full: Goodbye 2020, welcome 2021!

    The world stopped. Yes, that might be a good way to start describing it. We may not realize it, or maybe we do, but we are writing the story of an unusual year. A year when, for the first time in 115 years, the New York subway was closed. A year when the planes stayed on the ground. A year when humans took refuge in their homes and animals took to the streets. A year when pollution dropped for a while. A year when hugging a friend or family member turned into an act of madness. A year when a black cloud was seen coming in the distance and ended up covering almost all of us. Like every year, when a new one is about to begin, we evaluate the year that closes and make plans for the one that starts. We never imagined that our realities were about to change the way they did in 2020; we all thought this was going to be a great year, but the pandemic caught us by surprise and changed all our customs, especially those that we Argentines have so deeply rooted. We were left without:
      • Meetings with friends
      • Team sports
      • Leaving home, for at least 3 months
      • Kisses and hugs
      • Shaking the hand that gives so much warmth to interpersonal relationships
      • An office, perhaps one of the hardest losses along with the greetings, for not being able to share a place with our colleagues
      • Lunches at the office
      • The After Office or Admin Birras
      • Meetups and conferences
      • For some, seeing their families for 9 or 10 months
    The list could go on endlessly, but we also chose to see the full side of the glass, the side of the resilience our team has, because here we are, and here we continue day by day. We adapted to a New Normal:
      • Seeing each other and chatting 10 minutes before stand-up to check how our colleagues are doing
      • Helping each other more
      • Making the effort it takes to solve problems, even when they belong to someone else
      • Being there to review a PR or answer a question
      • Continuing to train ourselves and doing wonders at home to sit for certifications
      • Taking over a small space in the house to turn it into our bunker
      • Putting in what we have and more to move forward with our projects
      • Taking on projects and treating them as our own
      • Sharing knowledge
      • Communicating more and better through multiple channels
    Along the way we lost Colo and then Santi, but that does not mean they will stop being TeraClouders, because they are part of this family and are there to help us even when they no longer work with us. This palindromic year will not go unnoticed; that much already seems clear to us. Perhaps in some regions it is less noticeable than in others, but for the world, something has changed. For all that, and for much more, we continue to choose to see the full side of the glass. We are writing the year, and it is important that we are aware of that. And let's learn. Hoping that this global impasse will soon end; meanwhile, history continues to be written. Thank you for joining us in 2020, and here's to many more successes in 2021. Happy New Year from all the Teracloud team! Damian Gitto Olguin CTO Teracloud.io #Teracloud #goodbye2020 #cloud #aws #awslatam #happynewyear

  • Meet the new shell in the cloud: AWS CloudShell

    You've probably had a hard time setting up the AWS CLI, EB CLI, ECS CLI, etc. in your local environment. Well, stop wasting your time and try AWS CloudShell: the definitive shell in the cloud. AWS CloudShell is a browser-based shell available from the AWS Console. With just one click, CloudShell provides a fully managed Amazon Linux 2 environment with the latest versions of popular tools already installed. CloudShell also gives you 1 GB of storage per region, and the files stored in your home directory persist between sessions. And the good news is there are no additional charges: it's free. So, what are you waiting for? Let's start using it!
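    Once the environment opens, your console credentials are already available, so you can start running commands right away. A few illustrative first steps (the file name is just an example):

    # The shell inherits the permissions of the console user you signed in with
    aws sts get-caller-identity

    # Several common tools come pre-installed
    git --version && python3 --version && node --version

    # Anything under your home directory survives between sessions in the same region
    echo "hello from CloudShell" > ~/notes.txt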

  • MLOps at AWS re:Invent 2020

    Swami Sivasubramanian's Machine Learning keynote was full of exciting announcements of new tools and features. While the vast majority of them are industry breakthroughs, we chose just three to show you how they empower data scientists and reduce the time to market of ML solutions. These announcements share a common idea: MLOps - avoid spending money and valuable data scientists' time on infrastructure details in order to focus on the actual data analysis and model development. The first announcement - which is really a double one - is the inclusion of model-creation tools in two data-related services: Redshift ML and Athena ML. These new capabilities allow you to generate quick models, trained and tuned using Autopilot features, directly from your data sets. This is of course not intended to replace your data science efforts but to complement them, enabling validation and experimentation with models at much earlier steps of the Machine Learning Development Lifecycle (MLDL). With this service you can test your first hypotheses as soon as you have gathered your data, from the same console, so you avoid carrying them into later steps, reducing time and costs. On top of that, Redshift ML allows you to create, train, and deploy machine learning (ML) models using familiar SQL commands - you can even use the models as SQL functions in your normal queries! The second big one is SageMaker Clarify, a tool that helps you detect biases in your training data AND in your running models. Bias is a burning-hot topic in the ML field because it can impact a business globally (think of racial discrimination biases in job selection, for example). SageMaker Clarify automatically evaluates your input data, offering reports about the biases in each dataset before you even train your models. It also evaluates a running model and produces a different set of reports that help you understand potential issues regarding bias. Moreover, Clarify can constantly evaluate your models and raise alarms when the models' bias indicators start drifting, through its integration with SageMaker Model Monitor. Last but not least, there is Amazon HealthLake. The official description calls it a "HIPAA-eligible service that enables healthcare providers, health insurance companies, and pharmaceutical companies to store, transform, query, and analyze health data at petabyte scale." This is a huge improvement for health-related organizations seeking to capitalize on their data while staying compliant and protecting their patients' information. You can combine the knowledge-extraction capabilities of Comprehend Medical with the powerful analytics and data lake processing capabilities of HealthLake to get insights that were impossible until now. The MLOps umbrella is a set of ideals, goals, and some tools and best practices to grease the gears of the machine learning development process. In the MLOps world, data science teams focus only on the actual data science, with tools that even automate data cleansing, compute provisioning, and model deployment and testing. In the next post we will use some of the new tools to train and evaluate ML models, and use the power of MLOps to do it quicker and cheaper than ever before. If you need assistance improving your ML workloads, or if you are just starting your ML journey, contact us at Teracloud; we are AWS Select Consulting Partners.

  • INTERNAL WORKSHOP: KUBERNETES

    "It would be good to give an internal workshop on all this that Kubernetes has been working on" Me! a junior! To give a workshop, I couldn’t believe it. I have colleagues with a lot of career and experience. Yes, Kubernetes and Fluxcd, GitOps, something basic to explain how it works and how it is composed, package it with Helm, and finally deployment automation using flux. Yes, me. At the age of 27, I acquired teaching experience by teaching music, since I was 19 more or less. And if something left me and continues to leave me teaching, it is a path of constant learning. One does not simply "finish learning to teach", it is worth the beautiful tangle of words. In my head, it was generated as a paradox because on the one hand. I thought from what place someone who is just entering, who is learning, who is adjusting to a new world, can come to teach anything to people who have a lot of knowledge and experience. But I also argue that anyone can teach and anyone can learn because it has nothing to do with the level of expertise, but rather with contributing. Having to teach colleagues who have many years of experience, people who are ultimately my reference throughout this journey is at first a beautiful internal mess of uncertainty and insecurities. So with all that, we started the workshop: “What if I start it like this, if not, that I'm going this way, but I don't know and see what command you are going to throw in the terminal, check it and re-check it 30 a thousand times, that you notice how you express yourself ... and be careful, and also, that something does not fail you there live…. The hecatomb! " And so I could continue writing many, many more questions. However, little by little I made more room for the interesting to contribute and help. Not only for contributing and teaching per se, beyond insecurities, but it is also something that is good. If not, because if there is something positive that teaching co-workers has, it is this feeling of being included in a world that at first is a little alien to you, but that gradually ends up incorporating you, it begins to give you a meaning of belonging. As the days of the workshop went by, I assimilated it more and more, and little by little I was releasing loads of insecurity (and "nervous earthquake") because I also realized that we are all learning, starting with me, that I am just starting, even those who have already been in this universe for countless years, which is why it also helped me to break that vice of "dehumanizing" knowledge and also stop idealizing people who have such a high level of knowledge. On a personal level, the most remarkable thing was the last thing I mentioned. I think it is one of the most important obstacles that appear for a junior, even when asking for help for something that may not be working, and even more so when teaching. The fear of "looking (even more) ignorant" ends up weighing more than gaining knowledge based on doubts that can be answered by colleagues with more experience who went through the same thing, or worse. This is also reflected when teaching, underestimating one's knowledge and skills are very common vices that usually appear hand in hand with these insecurities, and even more if we add the fact that I am just taking my first steps. That is why giving this workshop was also like a breakdown of fears and insecurities since there is nothing healthier than facing what puts us to the test. 
Little by little, one stops recriminating mistakes and ignoring successes to go on to learn and build from mistakes and appreciate successes. When they told me if I wanted to write an article, the first thing I thought about was what to do it, if the topic to work on in the workshop, or what I “learned about teaching”. I think that on the web there is a lot of technical sections of Yamls and container orchestrators, of architecture design, based on AWS and containers, Kubernetes clusters Amazon EC2 compute instances and runs containers on those instances with deployment, maintenance, and scaling processes. So I preferred to elaborate on how my experience was, about how I felt so that I can serve other people who are just starting out and feel that at times they are a little frustrated, or that what they can contribute does not have the same validity as a colleague with more experience. In short, teaching in this workshop left me with certain important points: Don't belittle your skills and knowledge. Teaching is independent of the amount of knowledge and experience you have We all have something we can contribute, and it is important to know how to recognize it. Teaching is also learning, and more between co-workers. Do not be afraid of error. Entering Teracloud was an opening to a world of people with a lot of camaraderies and a great desire to learn. The fact of having been given the possibility of being in a workshop with so little traveled by myself speaks of the way of working and the human simplicity of those who are part, because essentially, rather than people who work, we are human, and making mistakes is part of this, and accept the error and build from there, even more. Leandro Mansilla DevOps Engineer Teracloud
