This mup.js (using kadirahq meteor-up) is meant to load-balance incoming traffic for example.com across two running services of the same app, each service running on a different AWS EC2 instance.
But when I shut down the service on the first EC2 instance and reload the page, I get "page can't be found" instead of the expected behaviour of being routed to the still-running second EC2 service.
Any idea what is wrong with my mup.js file, please?
module.exports = {
  servers: {
    one: {
      host: 'first ec2 public ip',
      username: 'ubuntu',
      pem: '../path to key.pem',
      env: {
        CLUSTER_BALANCER_URL: 'http://one.example.com'
      }
    },
    two: {
      host: 'second ec2 public ip',
      username: 'ubuntu',
      pem: '../path to key.pem',
      env: {
        CLUSTER_BALANCER_URL: 'http://two.example.com'
      }
    }
  },
  meteor: {
    name: 'example',
    path: '../../example',
    servers: {
      one: {},
      two: {}
    },
    buildOptions: {
      serverOnly: true
    },
    env: {
      ROOT_URL: 'http://example.com',
      MONGO_URL: 'mongodb://url from mongodb host',
      CLUSTER_DISCOVERY_URL: 'mongodb://url from mongodb host',
      CLUSTER_SERVICE: 'example'
    },
    docker: {
      image: 'abernix/meteord:base'
    },
    deployCheckWaitTime: 60
  }
};
The A records in the AWS DNS look like this:
example.com --> first ec2 public ip
one.example.com --> first ec2 public ip
two.example.com --> second ec2 public ip