Kick-starting Amazon SES

We have been using Amazon EC2 (Elastic Compute Cloud) as the deployment platform for a client. EC2 is part of the Amazon AWS cloud computing platform. We had problems sending bulk email notifications through Gmail for this particular application: Gmail has restrictions on bulk emailing and, as far as I know, Google does not support sending emails in bulk as of now. Google prefers that Google Groups be used for bulk emailing.

Anyway, since we were already on AWS and SES was made available this year, we decided to use it.


I’ve been reading through the ‘Getting Started’ and other development-related documents, available here.

The basic high-level steps with SES are:

Signing up for SES. An existing AWS subscriber already has SES enabled.
Registering the email IDs from which we want to send emails. This is basically similar to an email verification process. I believe one can register/verify up to 100 email addresses.
Test sending emails. Amazon provides a set of Perl scripts for testing the API from the command line, and it provides SDKs for Java, .Net, Python, Ruby, etc.
Applying for production access. Before production access is granted, there is a 2,000 emails per day limit during the testing phase.
Pricing – It costs $0.10 per 1,000 emails, but the first 2,000 emails per day for an EC2 customer are free. Data or bandwidth cost is separate, which starts at $0.12 per GB for the first 10 GB and then gradually decreases with usage. More details available here.
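As a rough sketch of the sending step, the SendEmail parameters can be assembled like this with the Python SDK (boto3); the addresses here are hypothetical placeholders and would need to be verified in SES first:

```python
# Sketch: building the parameters for SES's SendEmail API call.
# The addresses are hypothetical and must be verified in SES beforehand.

def build_send_request(source, to_addresses, subject, body):
    """Assemble keyword arguments for the SES send_email call."""
    return {
        "Source": source,                              # verified sender address
        "Destination": {"ToAddresses": to_addresses},  # list of recipients
        "Message": {
            "Subject": {"Data": subject},
            "Body": {"Text": {"Data": body}},
        },
    }

# With boto3 installed and AWS credentials configured, the request
# could then be sent like so:
#   import boto3
#   ses = boto3.client("ses", region_name="us-east-1")
#   ses.send_email(**build_send_request(
#       "sender@example.com", ["user@example.com"],
#       "Hello from SES", "This is a test notification."))
```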

Sending Limits – 10,000 emails per day to begin with. The limit increases over time, provided there are not many bounces or complaints on the emails sent. More details here.

The sending limit can go up to 1,000,000 emails per day. Customers who need more than 1,000,000 emails per day can contact AWS support directly and present their case; AWS may increase the limits for such customers.

Sending Rate – Starts at 1 email per second and goes up to 90 emails per second, again increasing gradually over time.
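A minimal sketch of pacing outgoing calls to stay under such a per-second sending rate (the quota value below is illustrative, not an SES API feature):

```python
import time


class SendRatePacer:
    """Spaces out successive send calls so they stay under a
    per-second quota, e.g. SES's initial 1 email/second rate."""

    def __init__(self, max_per_second=1):
        self.interval = 1.0 / max_per_second  # minimum gap between sends
        self._last = 0.0                      # monotonic time of last send

    def wait(self):
        """Block until it is safe to make the next send call."""
        now = time.monotonic()
        delay = self._last + self.interval - now
        if delay > 0:
            time.sleep(delay)
        self._last = time.monotonic()

# Usage sketch: call pacer.wait() before each ses.send_email(...) call.
```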

Usage Statistics – Sending statistics for a given period can be retrieved using the API or the web-based SES dashboard.

AWS .Net SDK – We are using SES from a .Net-based app, and integrating the SDK provided by Amazon was quite painless. Code samples are also available on the .Net SDK website.

An interesting undocumented feature of the AWS .Net SDK is its support for log4net. Configuring log4net with a logger named ‘Amazon’ enables logging from within the SDK. Although, some people would have preferred more pluggable logging, where one could plug in a different logging library.

Trouble-shooting – Quite a bit of material on Amazon SES is available around the web. The AWS forums are also a good starting point; most of the issues we faced were already solved on the forums.

Email Authentication – Sender Policy Framework (SPF) and Sender ID email authentication mechanisms can easily be used with SES. These mechanisms basically involve a DNS TXT record which specifies ‘amazonses.com’ as a sending domain. Amazon recommends setting up these records as a minimum.
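As a sketch, an SPF TXT record for a domain sending through SES might look like the following (the domain is a hypothetical placeholder, and the exact record should follow Amazon's current guidance):

```
example.com.    IN    TXT    "v=spf1 include:amazonses.com ~all"
```

The `include:amazonses.com` term authorizes Amazon's sending servers for the domain, while `~all` soft-fails everything else.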

For those who need to implement DomainKeys Identified Mail (DKIM) authentication, it is not offered by SES out of the box and has to be implemented separately as per the DKIM specification.

So, that was a brief introduction to Amazon SES. Hopefully, it will save you some time when getting started with SES.

With AWS and its cloud-based offerings, it looks like the promises made about “cloud computing” in the dotcom era are now finally coming true.

Begin now: AWS Cognito

Most real-world applications need a user login. However, setting up user authentication and management can take considerable time, and it is like reinventing the wheel every time. Here comes to our rescue a managed service from Amazon Web Services: Cognito User Pools.

What is a Cognito User Pool?

Amazon Cognito User Pool makes it easy for developers to add sign-up and sign-in functionality to web and mobile applications. It serves as your own identity provider to maintain a user directory. It supports user registration and sign-in, as well as provisioning identity tokens for signed-in users.

By using Cognito, you can set up your own sign-in and sign-up flow with MFA. All this without writing a single line of backend code.

And being integrated with the AWS ecosystem, it opens up a whole lot of possibilities for front-end applications, as you can connect with AWS S3, AWS AppSync, APIs, analytics, push notifications, etc.

AWS also provides an SDK, Amplify, in order to connect with some of the AWS services. So all you need to do is call SDK methods from your application and voila! It’s done.

So let’s get started with setting up Cognito.


Initial setup

We will be using User Pools to set up a custom login for our application.

A user pool is basically the collection of users of your application. You can also add users to different groups; for example, in a healthcare application you can create 2 groups, Patients and Doctors, which can be used to allow different actions for the two types of users.

Let’s first make a user pool by clicking on “Manage your User Pools”.

We’re gonna walk through this process step by step, so enter the Pool name of “App Users” and click “Step through settings”.

The next step is “Attributes”, where we define the attributes that our “App Users” will have.

Here you can select how you want your users to sign in: using a username, an email, or a phone number.

Here we will let users log in using their email as the username.

Next Cognito provides a list of standard user attributes which you can select to add in your user details. You can also add any custom attributes if you want.

Next, under Policies, we can select a password policy for the users. Currently, we have selected only a minimum password length, but you can add more conditions as well.

We can also select whether we want to allow users to sign themselves up or whether only an admin can add users. We will cover the differences in the final application flow later, depending on the choice made.

Under MFA and verifications, we can enable multi-factor authentication if needed. And we can also select if we want the users to verify their email in order to confirm their account.

The last important bit for our application is adding a client application which will be using Cognito in order to authenticate its users. We will set the refresh token expiration to 30 days, which means each login attempt will return a refresh token that we can use for authentication instead of logging in every time. We un-check “Generate Client Secret” because we intend to log into our user pool from the front end instead of the back end (we cannot keep secrets on the front end, because that is insecure).

Click “Create App” and then “Next Step” to move on to Triggers.

We can trigger different actions in user authentication and setup flow which can be configured here. A simple example could be to send the user a welcome message on signing up.

For the scope of this tutorial, we will not be using any AWS Lambda triggers. Let’s move on to the final step: Review.

  • Here we review all the setup configurations we have made. And hit “Create” to generate the user pool.
  • Take note of the Pool ID from the Pool details tab.
  • And the app client ID from the Apps tab. You will need these in your front end application to connect to this user pool.
  • The last thing left to setup is an identity pool.
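With the Pool ID and app client ID noted above, a self-sign-up call can be sketched as follows with the Python SDK (boto3); the client ID, email, and password below are hypothetical placeholders:

```python
# Sketch: assembling the parameters for Cognito's SignUp API call.
# Since this pool uses email as the username, the email doubles as Username.

def build_signup_request(client_id, email, password):
    """Build keyword arguments for the cognito-idp sign_up call."""
    return {
        "ClientId": client_id,   # the app client ID from the Apps tab
        "Username": email,       # email is the username in this pool
        "Password": password,
        "UserAttributes": [{"Name": "email", "Value": email}],
    }

# With boto3 installed and AWS credentials configured:
#   import boto3
#   idp = boto3.client("cognito-idp", region_name="us-east-1")
#   idp.sign_up(**build_signup_request(
#       "<app-client-id>", "jane@example.com", "S3cure!Pass"))
```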

Amazon Cognito identity pools (federated identities) enable you to create unique identities for your users and federate them with identity providers. With an identity pool, you can obtain temporary, limited-privilege AWS credentials to access other AWS services. Amazon Cognito identity pools support the following identity providers:

  • Public providers: Login with Amazon (Identity Pools), Facebook (Identity Pools), Google (Identity Pools).
  • Amazon Cognito User Pools
  • Open ID Connect Providers (Identity Pools)
  • SAML Identity Providers (Identity Pools)
  • Developer Authenticated Identities (Identity Pools)

In short, we can say that while a User Pool is used for authentication and user registration, identity pools are responsible for authorization, allowing different users access to different AWS services by assigning them different IAM roles.

Go to Federated Identities and begin the process to create a new identity pool. Give it an appropriate name.

Now under the Authentication providers section, we will add the Cognito user pool that we just created. Copy the pool id and the app client id. And if we wanted to allow Facebook login or any other login, we just need to add the app id in the respective section and that’s all.

On saving the identity pool, you are redirected to the next screen which creates the IAM roles. Here we can assign different roles to authenticated and unauthenticated users to authorize them to access other AWS services like S3, SNS, etc.
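As a sketch of that authorization side, the authenticated role might carry a policy like the following to grant read access to an S3 bucket (the bucket name is a hypothetical placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-app-bucket/*"
    }
  ]
}
```

The unauthenticated role would typically get a narrower policy, or none at all.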

We just need to note one more thing that is used in our front end applications: the identity pool ID.

This was all for the Cognito setup. With this, our backend system to manage users and authentication is complete. We now have a fully functional user authentication and authorization service with the following features, without writing any code:

  • Users can sign themselves up
  • User creation by admin with temporary passwords
  • Multi-factor authentication
  • Phone number and email verification
  • Forgot password
  • Social login

From machine intelligence to security and storage, AWS re:Invent opens up new options.


Technology as an enabler for innovation and process improvement has become the catchword for most companies. Whether it’s artificial intelligence and machine learning, gaining insights from data through better analytics capabilities, or the ability to transfer data and knowledge to the cloud, life sciences companies are looking to achieve greater efficiencies and business effectiveness.

Indeed, that was the theme of my presentation at the AWS re:Invent conference: the ability to innovate faster to bring new therapies to market, and how this is enabled by an as-a-service digital platform. For example, one company that had an increase in global activity needed help to accommodate the growth without compromising its operating standards. Rapid migration to an as-a-service digital platform led to a 23 percent reduction in its on-premises systems.

This was my first re:Invent, and it was a real eye opener to attend such a large conference. The week-long AWS re:Invent conference, which took place in November 2018, brought together nearly 55,000 people in several venues in Las Vegas to share the latest developments, trends, and experiences of Amazon Web Services (AWS), its partners and clients.

The conference is intended to be educational, giving attendees insights into technology breakthroughs and developments, and how these are being put into use. Many different industries take part, including life sciences and healthcare, which is where my expertise lies.


This slickly organized, high-energy conference offered a massive amount of information shared across numerous sessions, but with a number of overarching themes. These included artificial intelligence, machine learning and analytics; serverless environments; and security, to mention just a few. The main objective of the meeting was to help companies get the right tool for the job and to highlight several new features.

During the week, AWS also rolled out new functionalities designed to help organizations manage their technology, information and businesses more seamlessly in an increasingly data-rich world. For the life sciences and healthcare industry — providers, payers and life sciences companies — a priority is being able to gain insights based on actual data so as to make decisions quickly.


That has been difficult to do in the past because data has existed in silos across the organization. But when you start to connect all the data, it’s clear that a massive amount of knowledge can be leveraged. And that’s critical in an age where precision medicine and specialist drugs have replaced blockbusters.

A growing number of life sciences companies recognize that to connect all this data — across the organization, with partner, and with clients — they need to move to the cloud. As such, cloud, and in particular major services such as AWS, are becoming more mainstream. There’s a growing need for platforms that allow companies to move to cloud services efficiently and effectively without disrupting the business, but at the same time make use of the deeper functionality a cloud service can provide.

Putting tools in the hands of users


One such functionality that AWS launched this year is Amazon Textract, which automatically extracts text and data from documents and forms. Companies can use that information in a variety of ways, such as doing smart searches or maintaining compliance in document archives. Because many documents have data in them that can’t easily be extracted without manual intervention, many companies don’t bother, given the massive amount of work that would involve. Amazon Textract goes beyond simple optical character recognition (OCR) to also identify the contents of fields in forms and information stored in tables.

Another key capability with advanced cloud platforms is the ability to carry out advanced analytics using machine learning. While many large pharma companies have probably been doing this for a while, the resources needed to invest in that level of analytics has been beyond the scope of most smaller companies. However, leveraging an observational platform and using AWS to provide that as a service puts these capabilities within the reach of life sciences companies of all sizes.

Having access to large amounts of data and advanced analytics enabled by machine learning allows companies to gain better insights across a wide network. For example, sponsors working with multiple contract research organizations want a single view of the performance at the various sites and by the different contract research organizations (CRO). At the moment, that can be disjointed, but by leveraging a portal through an observational platform, it’s possible to see how sites and CROs are performing: Are they hitting the cohort requirements set? Are they on track to meet objectives? Or, is there an issue that needs to be managed?

Security was another important theme at the conference and one that raised many questions. Most companies know theoretically that cloud is secure, but they’re less certain whether what they have in place gives them the right level of security for their business. That can differ depending on what you put in the cloud. In life sciences, if you are putting research and development systems into the cloud, it’s vital that your IT is secure. But with the right combination of cloud capabilities and security functionality, companies can achieve a more secure setup in the cloud than they would on-premises.

The conference highlighted multiple new functions and services that help enterprises gain better value from moving to the cloud. These include AWS Control Tower, which allows you to automate the setup of a well-architected, multi-account AWS environment across an organization. Storage was also on the agenda, with discussions about getting the right options for the business. Historically, companies bought storage and kept it on-site. But these storage solutions are expensive to replace, and it’s questionable whether they are the best way forward for companies. During the re:Invent conference, AWS launched its new Glacier Deep Archive storage class, which allows companies to store seldom-used data much more cost effectively than legacy tape systems, at just $1.01/TB per month. Consider the large amount of historical data that a legacy product will have. In all likelihood, that data won’t be needed very often, but for companies selling or acquiring a product or company, it may be important to have access to that data.


One of the interesting things I took from the week away, apart from a Fitbit that nearly exploded with the number of steps I took in a day, was how the focus on cloud has shifted. Now the discussion has turned to: “How do I get more from the cloud, and who can help me get there faster?” rather than: “Is the cloud the right thing for my business?” Conversations held when standing in queues waiting to get into events or onto shuttle buses were largely about what each organization is doing and what the next step in its digital journey would be. This was echoed in the Anteelo booth, where many people wanted more information on how to accelerate their journey. One of the greatest concerns was the lack of internal expertise many companies have, which is why having a partner allows them to get real value and innovation into the business faster.
