Blog | Teracloud


  • INTERNAL WORKSHOP: KUBERNETES

    "It would be good to give an internal workshop on all this that Kubernetes has been working on" Me! a junior! To give a workshop, I couldn’t believe it. I have colleagues with a lot of career and experience. Yes, Kubernetes and Fluxcd, GitOps, something basic to explain how it works and how it is composed, package it with Helm, and finally deployment automation using flux. Yes, me. At the age of 27, I acquired teaching experience by teaching music, since I was 19 more or less. And if something left me and continues to leave me teaching, it is a path of constant learning. One does not simply "finish learning to teach", it is worth the beautiful tangle of words. In my head, it was generated as a paradox because on the one hand. I thought from what place someone who is just entering, who is learning, who is adjusting to a new world, can come to teach anything to people who have a lot of knowledge and experience. But I also argue that anyone can teach and anyone can learn because it has nothing to do with the level of expertise, but rather with contributing. Having to teach colleagues who have many years of experience, people who are ultimately my reference throughout this journey is at first a beautiful internal mess of uncertainty and insecurities. So with all that, we started the workshop: “What if I start it like this, if not, that I'm going this way, but I don't know and see what command you are going to throw in the terminal, check it and re-check it 30 a thousand times, that you notice how you express yourself ... and be careful, and also, that something does not fail you there live…. The hecatomb! " And so I could continue writing many, many more questions. However, little by little I made more room for the interesting to contribute and help. Not only for contributing and teaching per se, beyond insecurities, but it is also something that is good. 
If there is one clearly positive thing about teaching your co-workers, it is the feeling of being included in a world that at first seems a little alien, but that gradually incorporates you and gives you a sense of belonging. As the workshop days went by, I assimilated this more and more and slowly shed loads of insecurity (and "nervous earthquakes"), because I realized that we are all learning: starting with me, who is just beginning, but including those who have been in this universe for countless years. That helped me break the habit of "dehumanizing" knowledge, and stop idealizing people with very high levels of expertise.

On a personal level, that last point was the most remarkable. I think it is one of the biggest obstacles a junior faces, even when asking for help with something that isn't working, and all the more so when teaching. The fear of "looking (even more) ignorant" ends up weighing more than the knowledge you could gain from questions that more experienced colleagues, who went through the same thing or worse, can easily answer. The same thing shows up when teaching: underestimating your own knowledge and skills is a very common vice that goes hand in hand with these insecurities, even more so when you are just taking your first steps. That is why giving this workshop was also a breakdown of fears and insecurities; there is nothing healthier than facing what puts us to the test. Little by little, you stop punishing your mistakes and ignoring your successes, and instead learn and build from the mistakes while appreciating the successes.

When I was asked whether I wanted to write an article, the first thing I wondered was what to write about: the topic of the workshop, or what I "learned about teaching".
The web is already full of technical write-ups about YAML and container orchestrators, and about architecture designs built on AWS and containers, where a Kubernetes cluster runs containers on Amazon EC2 compute instances with deployment, maintenance, and scaling processes. So I preferred to write about my experience and how I felt, so that it can help other people who are just starting out and who at times feel a little frustrated, or feel that what they can contribute is somehow less valid than a more experienced colleague's contribution.

In short, teaching this workshop left me with a few important points:

  • Don't belittle your skills and knowledge.
  • Teaching is independent of how much knowledge and experience you have.
  • We all have something to contribute, and it is important to learn to recognize it.
  • Teaching is also learning, especially among co-workers.
  • Don't be afraid of mistakes.

Joining Teracloud opened the door to a world of people with great camaraderie and a great desire to learn. Being trusted with a workshop so early in my own journey says a lot about the way this team works and about the human simplicity of the people who are part of it. Essentially, before being people who work, we are human beings; making mistakes is part of that, and accepting the error and building from it, even more so.

Leandro Mansilla
DevOps Engineer
Teracloud

  • Connect with an AWS Hero, our Hero

Do you think you have the characteristics or qualities that heroes have? Being a hero is an act of courage: facing fears, challenges, and possible failures. Heroes are beings who save people, communities, or entire countries through their courage and dedication to others. In that sense, the hero usually embodies the most outstanding and valued traits of their culture of origin, with idealized abilities that allow them to accomplish great feats, and it is those heroic acts and contributions to the community that make them famous, admired, and truly valuable.

But we are not talking about comic-book superheroes here; we are talking about real-life heroes. This time, we want to tell you how happy and honored we are to have an AWS Hero on our Teracloud team. Our CTO and Co-Founder Damián Olguín became part of the history of AWS Heroes thanks to his enthusiasm, experience, and leadership as a community builder and as a creator of knowledge-sharing experiences across LATAM.

Let's get to know the superpowers of this Community Hero:

  • Blogger
  • Conference speaker
  • Meetup organizer
  • Streamer

Damián is a technology entrepreneur. His passion for the cloud world started when he was a boy, in high school, where his curiosity to learn programming was born; there he developed projects together with our Director of AI & ML, Carlos Barroso, aimed at helping people with disabilities. That is how he built his approach to different people and different technology projects. As a good leader, he works toward objectives despite obstacles; he is persistent, undoubtedly a superhero quality, which has led him throughout his career to participate in a variety of implementation scenarios with both open source and proprietary technologies.
He has over 15 years of experience in Unix/Linux server administration. He has also participated in several editions of AWS re:Invent (2015, 2017, and 2019) in Las Vegas, NV; at the last one he was named an AWS Community Leader and, together with Alejandro Pozzi, CEO & Founder of Teracloud, won the award for the most persistent team. He is an active member of the Spanish-speaking AWS community, helping users troubleshoot their issues. His incredible superpowers have also led him to co-organize the AWS User Group Córdoba (2017) and DoctArmy (2019), groups for everyone interested in cloud computing, where you don't need to be a computing genius, because the objective is to share experiences and knowledge about AWS and new technologies.

"I wouldn't be who I am, from the technical point of view, if it weren't for the community."

As you can see, he is constantly thinking about contributing more knowledge to the cloud world every day, generating new learning opportunities and discovering talent, thus fulfilling the purpose of being an AWS Hero: "The purpose of the AWS Heroes program is to recognize and honor the most engaged and influential developers who have a significant impact within the community. It also provides Heroes a place to tell their story and connect with like-minded developers." (1)

From there the idea was born: he felt the need to create a space to help companies adopt cloud technology, moving from on-premise architectures to the cloud. In 2016 he decided to apply everything he had learned and give shape to Teracloud, which now helps customers of all sizes design, build, migrate, and manage workloads and applications on AWS, in the United States, Canada, and elsewhere.
Through CI/CD automation, strong security practices (HIPAA, PCI), cost optimization, and data solutions such as artificial intelligence and machine learning, Teracloud transformed its customers' businesses and became an AWS Select Consulting Partner.

Like any hero, he has incredible feats to his name, among them:

  • Organizing Community Day in Buenos Aires, Argentina, in June 2019, where the main opening speaker was Werner Vogels, CTO of Amazon, and where Damián gave the talk "Automation as code: from zero to hero in minutes", drawing on the experience Teracloud gained over recent years automating and securing infrastructure for more than 200 clients around the world.
  • Introducing the first Spanish-language track at re:Invent 2019, held in December in Las Vegas, Nevada, where he spoke about how we build CI/CD pipelines the GitOps way; his interest has always been to grow communities, unite them, and make them stronger.
  • Co-hosting #DeepFridays, a Twitch streaming show that promotes the adoption of AI/ML technology by playing with AWS DeepRacer, DeepLens, and DeepComposer.

DeepRacer is a 1/18-scale autonomous race car designed to test reinforcement learning models by racing on a physical track, using cameras to view the track and a model to control throttle and steering. Stay tuned: the 2020 season of the AWS DeepRacer League is approaching its last round, with the Championship at AWS re:Invent 2020 from November 10 to December 15. Don't miss it! As for DeepLens, it is the world's first fully programmable video camera for developers, and DeepComposer is a musical keyboard that lets you create a melody and transform it into a completely original song in a matter of seconds, all thanks to artificial intelligence. Cool, no?
In this way they show that generative artificial intelligence, with its ability to create something new, is one of the greatest recent advances in AI technology. And last but not least, our hero currently specializes as a Cloud Solutions Architect, mainly on Amazon Web Services and Google Cloud, and is producing a new series of introductory YouTube videos, called AWS 101, for everyone who is just starting with AWS.

In other words, Damián Gitto Olguín is an AWS Hero from head to toe, always seeking everyone's benefit and giving his best version to make this world a better place: sharing his knowledge with all those who feel the same passion and enthusiasm for developing technology projects and working with the AWS platform, spending time educating others about a wide range of AWS services, and fulfilling his most important mission, evangelizing the entire AWS ecosystem, which he does very well!

In conclusion, we are fortunate to have him on the team, and we continue to grow as a family with a vision of heroes. Once again, congratulations Damián!

  • Pipeline notifications in a slack channel via AWS bots

In these times of remote work and global teams, it's not always easy to know what's going on on the other side of the world. A team in Europe might launch a pipeline just before leaving for the day, while you and the testers in the USA are still sleeping. If they forget to check the pipeline's status that day, nobody learns about a failure until the testers start their day. As a DevOps team we are responsible for creating and maintaining our CI/CD pipelines, and we can't be a blocking element in the SDLC; so, to improve our teams' performance, we need to anticipate those issues and implement automatic notifications.

Accelerate your operations with automation tools. Automation is the way: it helps not only to run the tests but also to provision, deploy, set up, and configure the testing and staging environments. With AWS Chatbot you can get pipeline notifications in your Slack channel. You just have to do this:

1. Configure your Slack workspace: create a Slack channel, install AWS Chatbot (https://www.slack.com/apps/A6L22LZNH-aws-chatbot), and invite it to the channel with /invite.
2. Configure the Chatbot client: create a new Slack client in the AWS Chatbot console (https://aws.amazon.com/chatbot/) and assign it permissions to the channel where you want to send the notifications. You can have more than one channel and report different pipelines in different channels.
3. Configure pipeline notifications: create a notification rule for every pipeline and set the events you want to be notified about. As the target, choose AWS Chatbot and the channel where you want to show the notifications.

The next time a pipeline runs, your whole team will be notified of the pipeline execution result! This removes dependencies between staff members, ensures everyone knows the latest status of the project, and reduces the time between a pipeline finishing and the team being notified, allowing faster remediation or earlier testing of the released project.
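If you manage your pipelines as infrastructure as code, the same console steps can also be expressed declaratively. Here is a minimal CloudFormation sketch of the notification rule; the rule name and both ARNs are hypothetical placeholders you would replace with your own:

```yaml
Resources:
  PipelineSlackNotifications:
    Type: AWS::CodeStarNotifications::NotificationRule
    Properties:
      Name: my-pipeline-slack-notifications          # hypothetical name
      DetailType: FULL
      # ARN of the pipeline to watch (placeholder)
      Resource: arn:aws:codepipeline:us-east-1:123456789012:my-pipeline
      EventTypeIds:
        - codepipeline-pipeline-pipeline-execution-failed
        - codepipeline-pipeline-pipeline-execution-succeeded
      Targets:
        # The AWS Chatbot Slack channel configuration created in the console
        - TargetType: AWSChatbotSlack
          TargetAddress: arn:aws:chatbot::123456789012:chat-configuration/slack-channel/my-channel
```

Keeping the rule in a template means every new pipeline can get the same notification behavior without repeating the console clicks.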
In other words, we introduce agility into our project. Other things we can report using this chatbot are:

  • Health check status alarms
  • Budget notifications
  • GuardDuty / Security Hub findings

You can use notifications to help developers stay informed about the key events in their software development life cycle. You can set up notification rules for build projects, deployment applications, pipelines, and repositories, and stay informed about events such as pull request creation, comments on your code or commits, build state or phase changes, deployment status changes, manual pipeline approvals, or pipeline execution status changes. When you run your pipeline, you get the expected results.

If you need help, do not hesitate to contact us; we will be happy to advise you. Accelerate digital innovation!

Lourdes Dorado
DevOps Engineer
Teracloud

Sebastián Serantes
DevOps Engineer
Teracloud

  • Existing Aurora MySQL Cluster: Encryption at rest from zero to KMS

Have you ever wanted to encrypt an unencrypted Aurora MySQL cluster with minimum downtime? You know you cannot create an encrypted replica from an unencrypted Aurora cluster, so I'm going to explain how to encrypt an unencrypted Aurora MySQL database using the binlog replication feature. I will assume that you have a custom DNS record for the database that points to the Aurora cluster endpoint. Well, let's do it!

Enable binlog
First, you have to enable binlogs on the existing Aurora cluster:
1. Select the cluster parameter group of the Aurora cluster.
2. Select the parameter binlog_format.
3. Modify the value to ROW.
Then reboot the DB instance to apply the change.

Create a new Aurora cluster from a snapshot
1. In the AWS RDS console, go to Snapshots.
2. Select the System tab.
3. Select the latest snapshot of the Aurora cluster, for example: rds:my-aurora-cluster-2020-09-15-05-00.
4. In Actions, select Restore snapshot, then configure the instance according to your needs, but make sure to enable encryption using the default AWS-managed KMS key for RDS.
5. Wait until the new cluster is ready.

Configure binlog replication to migrate the data
In the old DB cluster, create a new DB user specifically for replication and grant it permissions (replace the host pattern and password placeholders with your own values):

mysql> CREATE USER 'repl_user'@'<host-pattern>' IDENTIFIED BY '<password>';
mysql> GRANT REPLICATION CLIENT, REPLICATION SLAVE ON *.* TO 'repl_user'@'<host-pattern>';

In the new DB cluster, enable replication (the binlog filename and position can be found in the Events list of the new DB instance):

mysql> CALL mysql.rds_set_external_master ('niceonesa-prod-db-cluster.cluster-cemqzytdtxal.eu-west-1.rds.amazonaws.com', 3306, 'repl_user', '<password>', '<binlog-file>', <binlog-position>, 0);
mysql> CALL mysql.rds_start_replication;

Wait until the load is complete and validate that the ongoing replication continues with a replication lag of 0.

Migration
- Schedule a maintenance window.
- Set the site in maintenance mode.
- Stop the application servers to prevent transactions from being recorded on the old DB while switching databases.
- Stop the old DB cluster.
- Stop binlog replication in the new DB cluster:

mysql> CALL mysql.rds_stop_replication;

- Change the DNS record of the cluster DB to point to the new DB cluster endpoint.
- Set the site back in production mode.

And that's it. Now you have a fully functional Aurora MySQL cluster with encryption at rest using KMS. TeraTips!
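The DNS switch is the step that actually cuts traffic over to the encrypted cluster. Assuming the custom record lives in Route 53 (the record name and cluster endpoint below are hypothetical), it can be expressed as a change batch:

```json
{
  "Comment": "Point the database CNAME at the new encrypted Aurora cluster",
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "db.example.com",
        "Type": "CNAME",
        "TTL": 60,
        "ResourceRecords": [
          { "Value": "my-encrypted-cluster.cluster-xxxxxxxxxxxx.eu-west-1.rds.amazonaws.com" }
        ]
      }
    }
  ]
}
```

Applied with something like: aws route53 change-resource-record-sets --hosted-zone-id <zone-id> --change-batch file://change.json. A low TTL on the record ahead of the maintenance window helps clients pick up the new endpoint quickly.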

  • The new normal and the cloud world

2020 has been a year full of uncertainty; many lives and businesses have been affected by the COVID-19 pandemic. The situation has created challenges for companies, families, and the world in general, and for many of us it has required adopting new ways of working. The word that defines this is reinvention: we have had to reinvent ourselves in many ways, or bring out skills we had never had to exploit, both as employees and as companies. With that in mind, let's take a look at how the cloud industry has been affected, in this case for the better, by the sudden increase in demand and interest in the solutions it provides.

Productivity away from the office and school
Businesses and educational institutions around the world have had to make sudden changes, moving many of their services and operations to digital solutions and making a great effort to stay productive and meet the demands of their staff and clients. Schools and universities take advantage of video conferencing and online learning platforms to enable distance learning, since physical interaction is no longer an acceptable form of communication in light of social distancing efforts. Seen in a positive light, digitization and the benefits of cloud computing have helped many people and companies survive, and digital transformation is now a mandatory subject for anyone who wants to continue offering products and services in a globalized world. Where teleworking used to be seen as a privilege of the few, it is now a real possibility and a great help in increasing companies' productivity.

"In the area of technology, the jobs that will grow in the future will be those related to data science, along with everything that has to do with solution architects focused on cloud computing and omnichannel strategies, in order to enhance the value proposition companies offer their clients and create increasingly innovative experiences.
On the other hand, demand will continue for everything related to HR and what is known as e-recruiting: professionals prepared to conduct job interviews remotely, or to use technologies such as machine learning and deep learning to make interviews more enriching, simpler, and more agile. In marketing, big data and analytics tools will be used to generate new capabilities and new ways of running campaigns and attracting customers." (1)

Those who already have cloud infrastructure have an important advantage: they can migrate to a remote model and operate without interruption, optimizing the performance of their web applications and relying on the best orchestration tools in the cloud. The cloud is the most appropriate platform for working remotely. It guarantees productivity; it enables intense internal and external collaboration at a time when real-time information is needed to make crucial decisions as quickly as possible; and it provides business continuity and sustainability by ensuring connectivity and access to information. Among its many virtues, the cloud makes a business agile to manage: it facilitates access to applications and data, and offers the flexibility to make calls, send messages, set up video-conference meetings, and collaborate with partners, employees, and clients from any place with an Internet connection, on any type of device, under a single platform with video streaming services.

Streaming and video games in demand
The entertainment industry has also been affected, and at the same time boosted, by people having to stay at home.
The closure of the places we used to frequent before the COVID-19 pandemic appeared in our lives, going to the movies, restaurants, bars, hotels, and so on, has pushed many of us toward alternatives such as Netflix, YouTube, Amazon Prime Video, and online classes in cooking, marketing, or any topic of interest. Thanks to cloud computing we have been able to stay entertained in times of confinement, which matters for keeping up our optimism. In fact, there is varied entertainment to be found: music streaming, different video-on-demand platforms, live streaming services such as Twitch, online video games, and streaming platforms for events, among others.

Video game players have also increased their activity during this time, across age groups. "The Global Gaming Study: Impacts of COVID-19, conducted by Simon-Kucher & Partners, a global strategy and marketing consultancy, and Dynata, an independent market research institute, showed that gamers are playing more video games (30% growth in players who spend more than five hours a week); spending more money on video games (39% growth in monthly spending); playing new types of games, including those that focus on multiplayer (60% of players play new games); and watching more video game content via streaming (42% growth in players who watch video game streams)." (2)

Gamers also use these games as a channel for socializing while social distancing, which means the gaming industry will need to build its future trends around this focus on social interaction.

The e-commerce boom
E-commerce is another industry on the rise in this time of crisis, as millions of people around the world flock to e-commerce platforms. Industries continue to adapt to the new normal, and companies across sectors are beginning to realize the benefits and value of cloud computing, even beyond the need for remote work.
This is how hybrid scenarios have emerged that facilitate growth and innovation for companies of all sizes, many of which have been pushed to adopt the cloud by consumer demand. But how does cloud adoption help e-commerce? There is no doubt that running an e-commerce platform on the cloud helps reduce maintenance costs, so entrepreneurs in the sector can devote less human and economic effort to technology and focus on aspects that promote their own business, such as marketing or SEO. Another advantage is that information can be stored in a more profitable and optimal way: keeping the data in a virtual infrastructure makes it possible for the site to be always operational and helps reduce risks from theft, cyberattacks, and more.

This is where security comes to the fore in the new normal. Recently, cybercriminals have been taking advantage of poorly protected public cloud services to install malicious programs or to host fake web pages, which can lead to numerous legal and organizational problems and even damage the corporate brand.

In short, the cloud has come to revolutionize the world: helping us work more comfortably, from anywhere; offering companies ways to grow and reach more places without investing large sums of money, generating profitable business; and, as buyers, letting us get the products or services we want with just one click, with the assurance that our data and information will be safe. For this reason, to enjoy the benefits of the cloud, companies need to commit to an approach that ensures best practices and is supported by a robust, specialized, and reliable methodology. The right cloud service provider will always be important to feeling secure; at Teracloud we provide a set of scalable solutions for your company.
Solve your most difficult challenges with us. Have a great story to tell.

Liliana Medina
Community Manager
Teracloud

If you want to know more about our services, email our team member ben@teracloud.io

#Teracloud #AWSLatam #Cloudcomputing #Covid19 #Cloudsecurity #ecommercecloud #livestreamingplatform #bestorchestrationtoolsinthecloud #reinvention #scalability

  • Tips to take AWS Architect- Associate Certification

If you are preparing to take the Solutions Architect - Associate certification, keep reading: I have some tips that will help you pass the exam. But first, let's talk a little about why you should take this exam. Amazon Web Services (AWS) is by far the most popular cloud computing service, and with more companies and organizations running their infrastructure on AWS, the opportunities for cloud professionals are rising. Holding the AWS Solutions Architect certification, the most in-demand one, will help you stand out among other applicants. Studying for this exam will also give you a big-picture view of AWS cloud services, may introduce you to services you didn't know existed, and will help you gain the skills needed to design system architectures that follow the five pillars of the AWS Well-Architected Framework: Operational Excellence, Security, Reliability, Performance Efficiency, and Cost Optimization.

Now that we are determined to take the exam, where do we start? The first step should be downloading the official study guide for the exam from AWS, which contains the content outline, the target audience for the certification exam, and a brief description of the types of questions and how they are evaluated. The next step is to start studying. For this, I recommend using one of the many digital training platforms available, such as A Cloud Guru, Cloud Academy, or Udemy. For me, studying with video tutorials is much more bearable than reading lots of documentation. However, there is one whitepaper you should read: "AWS Well-Architected Framework". I also recommend reading the FAQ sections of at least the following AWS services: EC2, EBS, VPC, S3, IAM, Route 53, CloudFront, Global Accelerator, SQS, SNS, and Auto Scaling.

OK, are you ready to take the exam? Here are some tips to help you pass it. The SAA-C02 exam introduced several new AWS services.
Be sure the training course you choose to prepare for the exam has been updated with these services; A Cloud Guru covered these new resources.

  • The questions are all scenario-based. Read the whole question and focus on keywords like "highly available", "cost-effective", and "least operational overhead". First remove the obviously incorrect options, then focus on the possible answers. Be careful: an answer may seem to comply with the question, but if you read carefully, a single specific word can invalidate it.
  • Practice. Make use of the AWS Free Tier and try some basic configurations. Many training platforms also offer practical labs; hands-on labs will help you consolidate knowledge.
  • Give great importance to understanding EC2, VPC, and S3.
  • Know the differences between the load balancer types (Application Load Balancer, Network Load Balancer, and Classic Load Balancer), and also the different routing policies for Route 53.
  • Know the differences between the database services (RDS, Aurora, DynamoDB, and Redshift) and how to improve performance in each one.
  • Understand how to make solutions highly available by distributing instances across multiple AZs and using load balancers to distribute the workload.

You are ready now. Best of luck taking the exam!

Santiago Zurletti
DevOps Engineer
Teracloud

If you want to know more about our services, email our team member ben@teracloud.io

#Teracloud #AWSLatam #TeraTips #Wellarchitecture #AWS #AWSArchitectAssociateCertification #CloudGuru #CloudAcademy #Udemy #AWSservices

  • Scalability in the Cloud and the benefits it brings to your business

Have you considered the possibility that your business is not fast and efficient enough to cope with growth? This is surely one of the biggest concerns when running a company or business: we always want to have the best product or provide the best service and stay ahead of our customers' needs. You should know that virtual systems operating on cloud models will help you meet your company's growing storage needs and gain a competitive advantage.

Companies generate data constantly, and there are seasons when extra storage and infrastructure are necessary. For this reason, servers must always be available and capable of scaling as required. When building your IT infrastructure, it is therefore recommended to treat scalability not as an additional resource but as a necessity. This way, the company can expand its capacity and keep working without being exposed to downtime or expensive upgrades.

To be clearer: scalability expands or shrinks capacity to meet changing business demands, which means that scaling in the cloud gives your business the best flexibility in both time and money. When business demands increase, you can easily add nodes to increase your storage space, or increase the number of servers. When demand drops again, you can go back to your original settings.

But what are the top benefits of cloud scalability for businesses?

• It facilitates performance, giving you the ability to handle the bursts of traffic and heavy workloads that come with business growth.
• It allows your business to grow without making costly changes to your current configuration, reducing the cost implications of storage growth and making scalability in the cloud very cost-effective.
• Scaling up or out in the cloud is simple: you can order additional VMs (virtual machines) with just a few clicks, and once payment is processed, the additional resources are available without delay.
• It ensures that as your business continues to grow, your cloud storage grows as well. Scalable cloud computing systems adapt to your data growth requirements, so you don't have to worry about additional capacity needs.

As you can see, cloud platforms offer many benefits, and scalability is one of the most important. But let's take things step by step and look at the types of scalability in the cloud:

1. Vertical scaling: increasing the capacity of a single machine that already exists, adding more processing power, more storage, or more memory.
2. Horizontal scaling: adding more machines, that is, configuring a cluster or a distributed environment for your software system.

The important thing here is to understand the differences between these two scaling approaches, identify which one suits your requirements, and check whether your application really fits the model you choose. Consequently, you must first identify your goals and see whether your requirements can be met by increasing the capacity or adjusting the characteristics of a single machine. If not, go for the scale-out approach or a combination of both.

Ultimately, scalability is inherent to cloud computing. At Teracloud we offer cloud solutions adapted to each business, taking the requirements of each company into account. We make scaling simple, with recommendations that let you optimize performance, costs, or the balance between them, so your applications always have the right resources at the right time. Is your SaaS platform ready to scale to 1 million users? Reach out to our team to find out where to get started.
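The difference between the two approaches can be sketched with a toy model. This is illustrative Python only, not an AWS API; the Node and Cluster classes and the capacity numbers are hypothetical:

```python
class Node:
    """A server with a fixed request capacity per second."""
    def __init__(self, capacity):
        self.capacity = capacity


class Cluster:
    """A pool of nodes behind a load balancer."""
    def __init__(self, nodes):
        self.nodes = nodes

    def total_capacity(self):
        # The capacity of the whole pool is the sum of its nodes.
        return sum(n.capacity for n in self.nodes)

    def scale_vertically(self, extra):
        # Vertical scaling: make one existing machine bigger.
        self.nodes[0].capacity += extra

    def scale_horizontally(self, capacity):
        # Horizontal scaling: add another machine to the pool.
        self.nodes.append(Node(capacity))


cluster = Cluster([Node(100)])
cluster.scale_vertically(50)       # one bigger box: 150 req/s total
cluster.scale_horizontally(100)    # add a second box: 250 req/s total
print(cluster.total_capacity())    # 250
```

Note the design consequence the model makes visible: vertical scaling is bounded by how big one machine can get, while horizontal scaling just keeps appending nodes, which is why distributed, stateless applications fit the scale-out approach best.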

  • A brief review of the online Community Day 2020

    Community Day events have been held in cities around the world. On August 22, the event was held online, due to the situation we are going through worldwide. However, this did not diminish this incredible event, where community leaders from all over Latin America come together to present technical talks, workshops, and hands-on labs that bring us ever closer to the wonders the AWS Cloud offers. It provides a community-driven experience with AWS experts and industry leaders from around the world, where we all acquire new and better knowledge through peer learning. This is very enriching, since it covers not only the technical side of the solutions but also the human side that other people contribute when solving their challenges, and, as a plus, registration is free. In the past we had the opportunity to participate in Buenos Aires, where the 2019 edition was held. There we had the honor and luck of accompanying our CTO Damián Olguín, who gave a talk on "Automation as a code: From zero to ECS in minutes", demonstrating the experience Teracloud has acquired in automation and infrastructure with more than 200 clients. Keynote presenters for 2020 were Memo Döring (AWS Developer Relations LATAM), who has more than 12 years of experience working for technology companies; Sandy Rodríguez (CEO Certificate at SCRUM), a leading woman of the community in Mexico and none other than the founder of the Community Ambassadors Cloud; and Doris Manrique (Cloud Solutions Engineer, Soluciones Orion), founder and leader of the AWS Girls Community and passionate about new technologies. The most relevant topics in this Community Day were containers and Kubernetes, Machine Learning, and Serverless with AWS Lambda + API Gateway, among others.
This year our CTO Damián Olguín presented "setting up your own Streaming channel with AWS Media Services + Amplify", which included an introduction to AWS Media Services, the project being carried out with Amplify, and its demonstration. In this way, six hours of knowledge sharing unfolded, directly from the leaders of the user community, free and entirely in Spanish. There were also raffles for more than 10 certification-exam scholarships and more than 100 promotional credit coupons. The event also showcased new AWS releases:
1. AWS Controllers for Kubernetes preview: AWS Controllers for Kubernetes (ACK) is a new tool that lets you define and use AWS service resources directly from Kubernetes.
2. Amazon Kinesis Data Streams announced two new API features to simplify consuming data from Kinesis streams.
3. Application and Classic Load Balancers are adding defense in depth with the introduction of Desync Mitigation Mode: Application Load Balancer (ALB) and Classic Load Balancer (CLB) now support HTTP Desync Mitigation Mode, a new feature that protects your application from issues caused by HTTP desync.
As we can see, AWS is constantly innovating and growing to offer the services needed to build sophisticated applications with greater flexibility, scalability, and reliability, all of which are crucial in the DevOps world. At Teracloud, as an AWS Consulting Partner, we love to support customer innovation, and we like the community to stay connected so we can grow interactively and offer improvements to our customers. If you missed it, all the material is available on the official AWS Twitch channel https://www.twitch.tv/videos/718196267, and individual sessions will be uploaded to the community Twitch channel https://www.twitch.tv/awscommunitylatam. We look forward to seeing you at an upcoming Latam Community Day with more innovation and new services.

  • What is a software deployment and what should we take into account when making one?

    In recent years there has been a dramatic change in the market, creating a digital economy in which companies must take advantage of software to innovate or face a significant risk of becoming obsolete. IT organizations have to support a growing number of applications, and as these software portfolios change and grow, it becomes increasingly difficult to plan, build, test, and deliver them. Automating the software delivery process is a very good idea: it saves time, lets developers focus on writing code and creating useful features, gives them short feedback loops on the quality of their code, and, importantly, costs less. First of all, it is necessary to understand that each company has its own methodology that works best for it. Every application has its details and peculiarities, so the methodology that works for one can be disastrous for another. This is why it is very important to understand the client's organization as a whole: what the needs are, and, from a technological point of view, what it is you want to deliver. For this, at Teracloud we rely on the AWS Operational Excellence Pillar (Organization, Prepare, Operate, and Evolve). In this way, we guarantee a successful evolution of operations based on small, frequent improvements, which help us provide a safe environment and give us time to experiment, develop, and test improvements, so we learn from failures. It is important to be clear that the automation effort is not without risks and challenges. Without a thorough and carefully considered implementation plan, launching an application can be a nightmare. As software deployment specialists, we evaluate and assemble your applications for all environments, helping you deliver new technology to end users without the headache, since this has a certain complexity and requires a process we call Software Delivery Automation.
What is this? Deployment automation is what enables software to be deployed to test and production environments with the push of a button. Automation is essential to lower the risk of production deployments. It also provides quick feedback on the quality of the software, since it lets teams run comprehensive tests as soon as possible after changes. Our software deployment specialists have cultivated an efficient continuous delivery process that emphasizes extensive, automated testing before any code is integrated or deployed. We establish an organized central code repository with comprehensive version control and rollback processes that help us detect code errors and deploy iterations faster. By helping organizations solve the complex problem of continuously delivering high-quality software and closing the gap between Devs and Ops, version control and deployment automation let teams gain control, with actionable insights to track versions from start-up to production. Any organization that needs to deploy applications quickly and efficiently can benefit from different tools, and must follow the Software Development Life Cycle (SDLC) process step by step. This life cycle is divided into 7 phases, as seen in the image. The life cycle approach is used so that users can see and understand which activities are involved in each step. We explain a little about each one:
1. Planning: the phase where it is identified whether a new system is needed to achieve the strategic objectives of a company. It is the preliminary plan: identify the problem and determine the possible solutions.
2. Systems analysis and requirements: in this phase, you work on the source of the problem. Systems analysis is vital to determine what the needs of a company are.
3. Systems design: here the specifications, characteristics, and operations necessary to meet the functional requirements of the proposed system are described. This is where the necessary components (hardware and software), structure (network capacity), processing, and procedures are considered to achieve the objectives.
4. Development: this is where the "real work" begins; it is the phase where most of the program code is actually written.
5. Integration and testing: this phase determines whether the proposed design meets the initial set of business objectives. Tests are performed to detect errors, ensuring the successful completion of the program.
6. Implementation: here the program is actually installed. This step puts the project into production by moving the data and components from the old system onto the new system through a direct transition.
7. Operations and maintenance: here maintenance and the necessary periodic updates are done.
Of course, nowadays the SDLC is carried out in an automated way; this is achieved through a solid CI/CD pipeline and is built on the foundation of DevOps. One of the first restructurings for digital transformation is moving to a DevOps culture, with small, dynamic teams and cross-communication. The next stage is technology, which provides an infrastructure that supports rapid development cycles. The way to evaluate a good idea is through thorough and effective testing, not only of the quality of the code but also of the user's experience and preferences, and this can only be known through experience. As Kohavi, a distinguished engineer and general manager of Microsoft's experimentation team for artificial intelligence, puts it, "data trumps intuition". This is the purpose of continuous delivery and advanced deployment techniques.
CI/CD is the platform for rapid deployment; deployment techniques are tools for experimentation and refinement. Behind these two stages is a culture change, one that encourages innovation and tolerates failure and risk. Innovation is not a destination or a single point; it is a process that feeds on experimentation. Being willing to risk failure on the path of innovation requires a culture of humility |1|. With Continuous Integration, development changes are compiled and built with each commit, so problems surface faster, shortening the feedback loops in which developers address fixes. This is usually combined with an automated test suite to verify stability and functionality. This ongoing process of committing, building, and testing maintains superior code quality. Once the continuous integration stage has run, your application can be deployed, getting changes into production faster. This speed benefits both developers and operations, and developers and business leaders have the satisfaction of seeing new products go to market faster. This is really what has driven automation: "the ability to deliver", plus the advantage of detecting any problem early and acting in time. Finally, the life cycle approach of any project is a time-consuming process. Although some steps are more difficult than others, none should be overlooked; an oversight could prevent the entire system from working as planned. As Teracloud DevOps specialists we have extensive experience in managing this type of project. If you have a situation in your organization and think a custom software solution may be what you need, contact us today. Teracloud consultants can quickly guide you through each of these steps, ensuring your new system is online as soon as possible.
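The fail-fast, short-feedback-loop behavior of a CI pipeline described above can be sketched in a few lines. This is a toy model, not a real CI system: the stage names and pass/fail lambdas are invented for illustration.

```python
# Toy sketch of a fail-fast CI/CD pipeline: each commit runs through ordered
# stages, and the first failure stops the run so developers get feedback
# immediately instead of after a full (and wasted) deployment attempt.
from typing import Callable

def run_pipeline(stages: list[tuple[str, Callable[[], bool]]]) -> str:
    for name, stage in stages:
        if not stage():
            return f"failed at {name}"  # short feedback loop: stop right here
    return "deployed"

pipeline = [
    ("build", lambda: True),
    ("unit-tests", lambda: True),
    ("integration-tests", lambda: False),  # simulate a failing test suite
    ("deploy", lambda: True),              # never reached when a stage fails
]
print(run_pipeline(pipeline))  # failed at integration-tests
```

Replacing the lambdas with real build and test commands is what CI servers do; the ordering and early exit are the essential ideas.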
|1| Teaching an elephant to dance, Intentional evolution across teams, processes, and applications (e-book), redhat.com Damian Gitto Olguin Co-Founder / CTO Teracloud.io If you want to know more about our services, email our team member ben@teracloud.io #Teracloud #AWSLatam #Wellarchitecture #softwaredeployment #SDLC #DevOps #continuousintegration #CICD #plan #integration #test #softwaredeliveryautomation

  • 10 Trends of the Internet of Things (IoT) in 2020

    The Internet of Things (IoT) is opening new paths of development in the industrial and consumer worlds. Every layer of the business and consumer sectors, from retail to healthcare, has been influenced by smart technologies. IoT has become an innovation strategy, and it would be a mistake not to take advantage of all the benefits it brings, especially in these times when digital is the way forward. 2020 is a key year for the 4 components of the IoT model: Sensors, Networks (communications), Analytics (the cloud), and Applications. IoT and smart devices are already helping to improve the performance metrics of leading factories. They are in the hands of employees, facilitating routine management tasks and boosting productivity rates by around 40 to 60% [1]. The following 10 trends investigate the impact of many technologies on the IoT and shed light on its future. Businesses today want access to more data about their own products and internal systems, which improves their ability to make changes in time. For example, many manufacturers embed sensors in their products, which lets them receive information on how the product is operating and anticipate when a component may fail, so they can replace it before the damage occurs, increasing their reputation and reliability. In conclusion, there are many business opportunities in IoT, but there are several technological challenges. The Internet of Things allows a closer relationship with customers, and it is the ideal complement to Cloud Computing, since the cloud provides the processing needed to deliver more complete results. If cloud computing transformed the ecosystem of organizations, the IoT is doing the same, forcing them to consider whether they need tools for real-time processing and analysis, and to choose their sources of information better than ever to guarantee the return on their technology investments.
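The predictive-maintenance idea above (a sensor stream warning that a component may fail before it actually does) can be sketched with a simple rule. The threshold, the streak length, and the vibration readings below are all invented for illustration; real systems use far richer models.

```python
# Sketch of sensor-based failure prediction: flag a component for
# replacement when a reading stays above a warning threshold for several
# consecutive samples, a crude but illustrative degradation signal.
def needs_replacement(vibration_readings: list[float],
                      warn_threshold: float = 7.0,
                      consecutive: int = 3) -> bool:
    """Return True once `consecutive` readings in a row exceed the threshold."""
    streak = 0
    for reading in vibration_readings:
        streak = streak + 1 if reading > warn_threshold else 0
        if streak >= consecutive:
            return True
    return False

print(needs_replacement([5.1, 6.8, 7.2, 7.5, 7.9]))  # True: sustained rise
print(needs_replacement([5.1, 7.2, 6.8, 7.5, 6.9]))  # False: isolated spikes
```

The business value is in the early True: the manufacturer can ship a replacement part before the customer ever sees a failure.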
If you need advice, do not hesitate to contact us. Advances in data analysis and visualization make it easy to grow your business with data. At Teracloud we help you improve the decision-making process, from combining machine learning models with advanced prescriptive modeling methods to applying Artificial Intelligence and business optimization. Scale into the future with Teracloud! Reference [1] https://www.linkedin.com/pulse/nine-iot-predictions-2019-ahmed-banafa/ Liliana Medina Community Manager Teracloud If you want to know more about our services, email our team member ben@teracloud.io #Teracloud #AWSLatam #Wellarchitecture #IoT #automation #artificialintelligence #10trendsIoT #Networks

  • AWS Multi-account architecture

    Have you ever heard the saying 'never put all your eggs in one basket'? Even today we stumble upon projects, even recent ones, that are managed with a single AWS account. This triggers The Little Red Light (™) in the brain of any Teraclouder, because of the risks that decision implies. Why? As you can guess, the main benefits are on the security side. By using different accounts for different workloads or applications, we create a logical boundary and improve overall security. This also minimizes the blast radius in case of a security breach, because only the resources in the affected account will be compromised. While the fine-grained access controls provided by AWS IAM make it possible to separate access within an AWS account, it is easier to do so through the separation of access across accounts. From the billing perspective, using multiple AWS accounts simplifies how you allocate your AWS costs by helping identify which specific business unit (BU), environment, application, or project cost center is responsible for an AWS charge. Tagging resources might let you achieve the same, except that not every type of resource supports tagging for detailed billing. You might think that having an invoice for every account is going to be chaotic, but AWS Organizations allows a company to roll up the invoices from every account into one central bill. So the eggs are in different baskets, but they are still under control. Now, suppose your company asks you to prepare your environment to accept credit card payments. There is a very high chance that you need to comply with heavy regulatory requirements, as dictated by HIPAA or PCI, for example. If your whole environment is in a single account, the cost and time needed to adjust it to the regulations can be astronomical. By separating the critical, regulated workloads into a dedicated account, you can isolate these processes and keep the cost to a minimum.
Having several accounts also simplifies compliance. Many organizations have exceptional compliance or privacy considerations they need to apply to their data and processes, like HIPAA or PCI. Moving out the workloads that don't have to be compliant removes the need for those workloads to participate in time-consuming and burdensome audit processes. Enabling autonomy between teams may be another good reason to isolate each organizational unit within a dedicated AWS account and offer each group a playground or sandbox to experiment in.
Best practices
There is no single way to design your multi-account architecture; the design will be different for each company, depending on its workloads and security restrictions. However, there are common patterns that apply to any company:
● Centralize security logs, like CloudTrail logs, Config snapshots, and VPC Flow Logs, in an AWS account with restricted access.
● Centralize security findings and notifications across all accounts in one place.
● Use a dedicated account to deploy services shared between multiple accounts, like Active Directory.
And remember: always automate! Infrastructure automation enables speed through faster execution when configuring your infrastructure, removes the risk of human error, and allows collaboration between teams across the organization. Infrastructure as Code is a must-have in the multi-account scenario.
Conclusion
With a well-architected multi-account strategy, you will be able to innovate in AWS more safely, cheaply, and tidily, and all your eggs will be safe, because now you have a plan with contingency actions. Do not hesitate to contact us at Teracloud.io with questions or concerns about multi-account architecture on AWS. We can help you.
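The billing roll-up described above (charges stay attributable to the account that incurred them while everything lands on one consolidated bill) can be sketched as a simple aggregation. The account names and dollar figures below are invented for illustration; this is not an AWS API call.

```python
# Sketch of consolidated billing across a multi-account setup: each line
# item belongs to one account (business unit, environment, project), and
# the roll-up yields both a per-account breakdown and one central total.
from collections import defaultdict

def consolidate(charges: list[tuple[str, float]]) -> tuple[dict, float]:
    """Group line items by account, then roll them up into one total bill."""
    per_account: dict[str, float] = defaultdict(float)
    for account, amount in charges:
        per_account[account] += amount
    return dict(per_account), round(sum(per_account.values()), 2)

charges = [
    ("prod", 1200.50), ("staging", 310.00),
    ("security-logs", 85.25), ("prod", 99.75),
]
by_account, total = consolidate(charges)
print(by_account)  # {'prod': 1300.25, 'staging': 310.0, 'security-logs': 85.25}
print(total)       # 1695.5
```

The per-account breakdown is exactly what tagging tries to approximate inside a single account; account boundaries give you it for free.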
Santiago Zurletti DevOps Engineer Teracloud If you want to know more about our services, email our team member ben@teracloud.io #Teracloud #AWSLatam #Wellarchitecture #Multiaccountarchitecture #alwaysautomate #AWSsecurity #flexiblebilling #costingoptions

  • Is the company's information protected in employees' home office environments?

    When the COVID-19 crisis suddenly turned ours into a confined economy, many people were able to keep their jobs thanks to telecommuting. The priority was to maintain production, even above security, and that widened the door to cybercrime, which has been multiplying lately. We are increasingly digitized but less protected. The number of cyber-attacks increased 40% globally, according to data from IBM, and the hardest part is that the world does not have enough professional experts on the subject: by 2022, an estimated 1.8 million cybersecurity-related jobs will be vacant worldwide. These numbers are explained by the increasing digitization of society: "An increasing part of our lives is online: bank accounts, savings, private data, political opinions ... Everything that can be used to manipulate or harm you is in the cloud, and if it is not properly protected, you are in the hands of the attacker", says Fernando Rodríguez, co-founder and Chief Learning Officer of the KeepCoding Programming and Technology Training Center. Companies had to organize emergency remote access to their systems for teleworking, and that made them vulnerable. This implies a delicate balance between the need to avoid the serious financial damage of inactivity and the risks of data hijacking (known as ransomware) and other leaks of sensitive information. What can organizations do, beyond the technical measures companies implement, so that workers are not compromised? It is necessary to train employees in cybersecurity: cover the webcam to avoid possible remote access, activate two-factor authentication on every platform where it is available, and think before publishing photos, videos, comments, and other posts. Cybercriminals are taking advantage of the population's desperation over the pandemic to achieve their goals.
To do this they are creating malicious apps, fake websites, and campaigns via email, text messages, and WhatsApp. Consequently, it becomes essential to create spaces for digital education and awareness campaigns. The reasoning is simple: cybercrime exists because there is more and more to steal, what there is to steal has more and more value, and there is a high probability of going unpunished: "It is a very profitable crime and one for which rarely does anyone end up in jail. It is done from country to country and it is very difficult to pursue", explains Santiago Moral, director of the DCNC Sciences Institute of the Rey Juan Carlos University (URJC), in Madrid. Who are the victims of cybercrime? SMEs and private users. Because teleworking is a new modality for many of them, cybercriminals take advantage, exploiting vulnerabilities and attacking the systems of organizations that were not, or are not, sufficiently prepared to put an effective security system in place. The biggest threats depend on the target. For SMEs, the threat is to their continuity. There are tremendous volumes of ransomware, the malicious programs mentioned earlier that restrict access to certain parts of a computer system and demand a ransom for their release; if your organization is hijacked this way, it cannot operate. The home office increases cyber risks: devices without the necessary protection can cause data loss and privacy violations. Proactive measures can improve users' experience and security when working under this scheme. So, in the midst of the emergency, companies that currently have employees working from home need to suggest that they take the following steps:
• Install a trusted security solution on all devices that handle corporate data. If your budget is tight, install a free antivirus; even this will significantly reduce the risk of getting infected and having problems with the company.
• Update your software, since the latest versions of programs fix vulnerabilities with patches. It is important to update everything you have installed on any device you use for work.
• It is useless to protect your computer if an attacker connects to your Wi-Fi or infiltrates your router, so make sure the connection is encrypted; WPA2 is the option that suits you best.
• Change the login credentials and password for accessing the router settings, if you have not done so before. The default passwords for some router models are not only very weak, they are also available on the Internet and easy to find.
• Your company has surely contracted a series of services for employees to use, such as Microsoft Office 365, a messaging platform like Slack or HipChat, or at least a corporate email service. Use these corporate resources for exchanging documents and other information.
• If someone urgently needs an important document or demands the immediate payment of an invoice, check that they are who they say they are. Don't be afraid to call the other parties involved, or to confirm the order once again with your boss. Be especially suspicious of email messages with links.
• Another measure may be to adopt the Amazon WorkSpaces service, a secure, managed desktop-as-a-service (DaaS) solution. With Amazon WorkSpaces, your users get the fast, responsive desktop of their choice, which they can access from anywhere, at any time, from any compatible device. You can pay for the WorkSpaces you deploy per month or per hour, saving money compared to traditional desktops and on-premises VDI solutions.
Finally, people must understand that even if they are outside the office, the device they work from is a door into the entire organization, so they must guarantee its proper use.
With a large number of professionals working from home, connecting to the Internet through a home network that is probably poorly protected and running outdated software, each of those employees is a potential target. If you have a small or medium business and feel that your information and that of your clients is at risk, do not hesitate to contact us. We can help you solve security problems and provide cloud services that will promote the growth and support of your company. We offer a free assessment, and we are experts in protecting information and ensuring your business has the tools it needs to offer products and services in complete safety.
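The two-factor authentication recommended in the steps above usually means a time-based one-time password (TOTP), the six-digit codes from an authenticator app. A minimal standard-library sketch of the underlying mechanism (HOTP per RFC 4226, made time-based per RFC 6238) looks like this; a real deployment should use a vetted library rather than this sketch.

```python
# Minimal sketch of the OTP mechanism behind two-factor authentication:
# HMAC-SHA1 over a moving counter, dynamically truncated to a short code.
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30) -> str:
    """TOTP (RFC 6238): HOTP with the counter derived from the current time."""
    return hotp(secret, int(time.time()) // period)

# The RFC 4226 test secret; counter 0 yields the documented value 755224.
print(hotp(b"12345678901234567890", 0))  # 755224
```

Because the code depends on a shared secret plus the current 30-second window, a stolen password alone is no longer enough to log in.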
