
Deployed vs AWS Lambda

Deployed.cc is an open-source no-ops platform for deploying web projects on virtually any cloud server directly from your git repository. You can think of Deployed.cc as Lambda on your own servers with zero dependencies. Let me show why Deployed.cc can be the best alternative to AWS Lambda and other serverless computing services. We’ll compare:

  • Complexity
  • Pricing
  • Testing locally
  • Continuous deployment

Complexity

Let’s check how to deploy a simple Node.js project with a single GET request using Deployed.cc and AWS Lambda. Imagine that you already have a Node.js project on your local machine and have registered AWS and Deployed.cc accounts.

Deployed.cc:

- Create a Node.js project at deployed.cc/me
- Connect the project’s GitHub, Bitbucket or GitLab repository
- Push your local code to the remote git repository

AWS Lambda:

- Create a Lambda project on your AWS account
- Install the AWS CLI & configure credentials
- Install the additional “serverless” npm package
- Configure serverless.yml
- Install the “serverless-offline” npm package for testing on your local machine
- Deploy the function using the sls CLI, not Git

With Deployed.cc you don’t need to create any configuration files, adapt your code, or set up any credentials. Deployments are handled over Git, which means you have a single entry point for your code and your deployments. With Lambda you have to write functions in a specific format, adapt them to AWS Lambda’s limits, and use a separate CLI to get them online.
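To make that concrete, here is a rough sketch of the same single-GET-request service in both styles. The /hello route, port 3000 and file names below are arbitrary choices for this example, not part of either platform.

```js
// index.js — a plain Node.js service, deployable with Deployed.cc via git push
// and runnable locally with `node index.js`. No SDKs or config files needed.
const http = require("http");

const server = http.createServer((req, res) => {
  if (req.method === "GET" && req.url === "/hello") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ message: "Hello from your own server" }));
    return;
  }
  res.writeHead(404);
  res.end();
});

server.listen(process.env.PORT || 3000);
```

```js
// handler.js — the same endpoint adapted to the AWS Lambda handler format.
// There is no server to start: AWS invokes this function per request, and the
// HTTP routing lives in serverless.yml, deployed with `sls deploy`.
exports.hello = async () => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Hello from Lambda" }),
  };
};
```

The first file is a complete service on its own; the second still needs a serverless.yml describing the HTTP event plus the sls CLI to get online.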

Pricing

AWS provides 400,000 GB-seconds of computing time every month for free. “That’s awesome,” you might say. I don’t think so, and here’s why. AWS tries to charge you for every possible byte and millisecond: you pay to upload your code, you pay again when a user downloads a file, and so on. When you use Lambda you also pay for the additional services around it. For example, if your Lambda function reads and writes data to or from AWS S3, you pay for the requests and the data stored on Amazon S3. If you set up continuous deployment for your Lambda functions, you pay for S3 storage too. The main problem is that you don’t know in advance how much you’ll pay in a given month. Imagine that someone decides to attack your website and it starts receiving a million requests per hour and downloading gigabytes of data. In that case, you could lose thousands of dollars within a day.

With Deployed.cc all costs are predictable. You pay per server, not per request. Yes, cloud providers like DigitalOcean, Vultr and Hetzner don’t have a free tier, but the cheapest server (1 GB memory and 1 CPU) costs $4-5/month and can handle hundreds of millions of requests each month (traffic that would cost $100 and up with AWS Lambda).
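For a rough sense of scale, here is a back-of-the-envelope estimate using Lambda’s commonly published x86 prices (about $0.20 per million requests and $0.0000166667 per GB-second; prices vary by region and change over time). The traffic profile below is an assumption for illustration, not a measured workload.

```js
// Back-of-the-envelope Lambda cost estimate. The rates are the commonly
// published x86 prices and may differ in your region; the traffic profile
// is an assumption chosen only to illustrate the scale.
const requests = 200_000_000;       // 200M requests per month
const avgDurationSec = 0.1;         // assumed 100 ms per invocation
const memoryGb = 0.25;              // assumed 256 MB function

const requestCost = (requests / 1_000_000) * 0.20;          // $0.20 per 1M requests
const gbSeconds = requests * avgDurationSec * memoryGb;     // 5,000,000 GB-s
const billableGbSeconds = Math.max(0, gbSeconds - 400_000); // monthly free tier
const computeCost = billableGbSeconds * 0.0000166667;

console.log(`~$${(requestCost + computeCost).toFixed(0)} per month`);
// ≈ $40 + $77 ≈ $117 with these assumptions, and that is before S3,
// API Gateway or data transfer, while a $5 VPS price stays flat.
```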

Testing locally

When you’re working on an API, it’s important to have an environment on your local machine that is similar to the environment on your servers. Let’s check how to set up a local test environment with Deployed.cc and AWS Lambda.

Deployed.cc:

- Just run your project using “node index.js” or a similar command; nothing new to install here

AWS Lambda:

- Install the “serverless-offline” npm package
- Run your project using the sls CLI, not standard Node.js commands

With Lambda functions you have to install an additional npm package to run them on your local machine. With Deployed.cc you don’t have to install anything: just run your project using standard Node.js commands, as in the sketch below.
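For example, the index.js sketch from the Complexity section can be started with “node index.js” and checked with a few lines of plain Node (Node 18+ ships fetch built in; the URL and port are the same assumptions as above).

```js
// smoke-test.js — hit the locally running server started with `node index.js`.
// Uses the fetch built into Node 18+, so there is nothing extra to install.
async function main() {
  const res = await fetch("http://localhost:3000/hello");
  console.log(res.status, await res.json());
}

main().catch((err) => {
  console.error("Local server is not reachable:", err.message);
  process.exit(1);
});
```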

Continuous deployment

If you haven’t tried to set up continuous deployment for your Lambda functions before, just check this long official AWS tutorial: https://docs.aws.amazon.com/lambda/latest/dg/applications-tutorial.html. With Deployed.cc you get continuous deployment by default: each time you push your code to the remote git repository, it’s automatically deployed to your own servers in a separate container. Each git branch gets its own URL, and you have full access to the project’s logs.

Feel free to contact us if you have any ideas on how we can make our platform even better.