I have been having problems with this for a while and have tried everything I know, so I figured it was finally time to ask for some help.
Any edit I make to /etc/hosts just doesn't work.
Example:
julian@ifrit:~$ cat /etc/hosts
127.0.0.1 localhost
127.0.1.1 ifrit
192.168.1.100 dev.julianfernand.es
In the example above, when I access dev.julianfernand.es (this domain doesn't exist), it should load from 192.168.1.100.
If I ping the name, it resolves just fine. However, when I access dev.julianfernand.es in Google Chrome or Firefox, it doesn't.
Now, after I restart a couple of times, it works. But since I work at a managed WordPress hosting company, I deal with many situations where I have to edit my hosts file to see a customer's website on our server.
I just can't keep restarting my computer; it isn't productive at all. Restarting the networking service doesn't help, and neither does clearing caches (even Chrome's internal DNS cache).
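One quick way to narrow this down is getent, which resolves names through NSS (/etc/nsswitch.conf) and therefore reads /etc/hosts the same way ping does. A diagnostic sketch, reusing the hostname from the example above:

```shell
# getent goes through NSS, so it sees /etc/hosts -- just like ping.
# A browser that keeps its own DNS cache (or resolves via dnsmasq)
# can disagree with it, which is exactly the symptom described above.
getent hosts localhost                     # sanity check: prints the /etc/hosts entry
getent hosts dev.julianfernand.es || true  # empty here; on the box above it should print 192.168.1.100
```

If getent shows the right address while the browser still misses, the problem is in front of NSS (browser cache, dnsmasq), not in /etc/hosts itself.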
Does anyone have an idea? This happens with elementaryOS (based on Ubuntu 12.04) and Ubuntu 13.10 (daily). I haven't tried any other version yet.
PS: in case it matters, I have an NGINX server running on this machine with PHP-FPM and MySQL.
Thanks in advance :)
Answers
In Ubuntu, if you want to flush the DNS cache, you need to restart the nscd daemon.
Install nscd using the following command:
sudo apt-get install nscd
Then flush the DNS cache in Ubuntu using the following command:
sudo service nscd restart
OR
sudo service dns-clean start
For me the solution was to edit the /etc/nsswitch.conf file (e.g. with sudo vim /etc/nsswitch.conf). I changed the line:
hosts: files mdns4_minimal [NOTFOUND=return] dns
to:
hosts: dns files mdns4_minimal [NOTFOUND=return]
and now it works as expected!
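The same edit can be made non-interactively with sed. A sketch on a scratch copy, so nothing on the real system changes (drop the /tmp indirection and add sudo to apply it for real):

```shell
# Start from the stock hosts: line (scratch copy, not the real file)
printf 'hosts: files mdns4_minimal [NOTFOUND=return] dns\n' > /tmp/nsswitch.sample

# Rewrite the hosts: line as described above
sed -i 's/^hosts:.*/hosts: dns files mdns4_minimal [NOTFOUND=return]/' /tmp/nsswitch.sample

cat /tmp/nsswitch.sample
```

Against the real file that would be: sudo sed -i '...' /etc/nsswitch.conf (keep a backup first).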
The following worked for me: add
addn-hosts=/etc/hosts
in
/etc/NetworkManager/dnsmasq.d/hosts.conf
then kill dnsmasq and run
service NetworkManager restart

The accepted answer works in 12.04 through 13.04 by disabling dnsmasq, but it stopped working for me in 13.10. I found the following new solution for 13.10.
Edit your /etc/default/dnsmasq and change ENABLED=1 to ENABLED=0 and restart.
From:
A new "feature" in Ubuntu 12.04 desktop edition is to use dnsmasq as a plugin to NetworkManager for local DNS. Dnsmasq is intended to speed up DNS and DHCP services but comes with one unfortunate side effect: dnsmasq caches local DNS and ignores changes to /etc/hosts. I make frequent changes to the hosts file while working on websites so this "feature" was quite annoying.
The solution is to disable dnsmasq in the NetworkManager configuration file. Open /etc/NetworkManager/NetworkManager.conf and comment out the line:
dns=dnsmasq
My NetworkManager.conf file now contains the following:
[main]
plugins=ifupdown,keyfile
# dns=dnsmasq
[ifupdown]
managed=false
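Commenting the line out can also be scripted. A sketch on a scratch copy of the config shown above (run the sed against the real /etc/NetworkManager/NetworkManager.conf with sudo to apply it):

```shell
# Scratch copy of NetworkManager.conf, matching the example above
cat > /tmp/NetworkManager.conf.sample <<'EOF'
[main]
plugins=ifupdown,keyfile
dns=dnsmasq

[ifupdown]
managed=false
EOF

# Comment out the dns=dnsmasq line
sed -i 's/^dns=dnsmasq/# dns=dnsmasq/' /tmp/NetworkManager.conf.sample

grep dnsmasq /tmp/NetworkManager.conf.sample   # now commented out
# afterwards: sudo service network-manager restart
```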
It's silly, but the browser cache was the culprit in my case; that should be obvious for a developer :(
Another silly (obvious) thing that I missed: browsers (Chrome, Firefox) perform a Google (or other search engine) search directly from the address bar, so instead of resolving the address you typed, they search for that word on the internet!
Solution:
Once the /etc/hosts file is edited, ping the chosen address! If the ping command resolves to the IP you have given, the hosts file works.
Clear your browser's cache (browsers cache DNS resolutions for some time (TTL)).
Disable the default search engine of the browser you are using
Simple and Updated
- Create /etc/NetworkManager/dnsmasq.d/hosts.conf.
- Put lines like address=/whatever/1.2.3.4 in it. See the dnsmasq docs (look for --address). Wildcards are possible: address=/.whatever./1.2.3.4.
- Kill dnsmasq (bug).
- Restart it: service network-manager restart.
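The steps above can be sketched end to end; the file name hosts.conf and the placeholder whatever/1.2.3.4 come from the list above, and the snippet is staged in /tmp first so you can check the syntax before touching the system:

```shell
# Stage the dnsmasq snippet in /tmp to double-check it
cat > /tmp/hosts.conf.sample <<'EOF'
# this name (and its subdomains) resolves to 1.2.3.4
address=/whatever/1.2.3.4
EOF
cat /tmp/hosts.conf.sample

# Then, for real:
#   sudo cp /tmp/hosts.conf.sample /etc/NetworkManager/dnsmasq.d/hosts.conf
#   sudo pkill dnsmasq
#   sudo service network-manager restart
```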
Edit /etc/nsswitch.conf and comment out the line below by adding # in front of it:
hosts: files mdns4_minimal [NOTFOUND=return] dns myhostname
and add:
hosts: files mdns4_minimal [NOTFOUND=return] dns
With this configuration, the domain lookup consults files first, which is /etc/hosts, and only then falls back to DNS.
For a quick test of whether it's working, you can use
sudo python -m SimpleHTTPServer 80
to create a simple HTTP server that serves files from the current directory, then comment out the lines below in the /etc/hosts file:
127.0.0.1 localhost
127.0.1.1 01hw730983
and add:
127.0.0.1 content
Then go to a browser and type content/. If you can see the directory structure, it's working; otherwise it's not.
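On current systems the Python 2 SimpleHTTPServer module no longer exists; a sketch of the same quick test with Python 3's http.server (a high port avoids needing sudo; 8080 is an arbitrary choice here):

```shell
# Serve the current directory on a high port in the background
python3 -m http.server 8080 >/dev/null 2>&1 &
SRV=$!
sleep 1

# Fetch the listing; with the hosts entry in place you could use
# http://content:8080/ instead of 127.0.0.1
python3 -c 'import urllib.request as u; print(u.urlopen("http://127.0.0.1:8080/").getcode())'  # prints 200

kill $SRV
```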
It sounds like your setup is a lot like mine. I have a box running Ubuntu as my office server and development host. On that box, I run Nginx, Apache, Tomcat, Rails apps and whatever else I need.
From my Mac, I can simply add a hosts entry and load the server with whatever name I need, but from an elementaryOS client, that doesn't work.
I tried the fixes above, with no success.
What I did was to run Squid Proxy Server and add the hostnames to /etc/hosts on the server. (Editing requires a restart of squid.)
After that, make the appropriate proxy settings in your browser or OS's control panel.
Check the permissions on the /etc/hosts file.
On my cloud service, after cloning a server, the permissions on the hosts file changed from 644 to 600, so I guess the file could no longer be read by non-root processes such as Apache's www-data. I ran sudo chmod 644 hosts from /etc and that fixed it.
The problem started out as:
MongoConnectionException Failed to connect to: localhost:27017: Previous connection attempts failed, server blacklisted.
I tracked it down to the server variable in MongoClient pointing at localhost. I was unable to ping localhost or the hostname.
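The check and the fix can be sketched like this, on a scratch file so nothing real is touched (swap in /etc/hosts with sudo for the actual repair):

```shell
# Reproduce the broken state on a scratch file
touch /tmp/hosts.sample
chmod 600 /tmp/hosts.sample
stat -c '%a' /tmp/hosts.sample   # prints 600: only the owner (root) can read it

# Restore the expected world-readable mode
chmod 644 /tmp/hosts.sample
stat -c '%a' /tmp/hosts.sample   # prints 644

# the real fix: sudo chmod 644 /etc/hosts
```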
These days (2021) Firefox uses DNS over HTTPS, which you have to disable for it to respect your hosts file. This has some privacy implications, so get informed before you do it.
Go to Firefox settings and search for DNS. Click Settings next to "Configure how Firefox connects to the internet". At the bottom, uncheck "Enable DNS over HTTPS".
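If you prefer to script this instead of clicking through the UI, the preference behind that checkbox is network.trr.mode; setting it to 5 turns DNS over HTTPS explicitly off. A sketch that appends it to a profile's user.js (the profile path here is a stand-in; your real one lives under ~/.mozilla/firefox):

```shell
# Stand-in profile directory; substitute your actual Firefox profile path
PROFILE=/tmp/firefox-profile-sample
mkdir -p "$PROFILE"

# 5 = DNS over HTTPS off by explicit choice
echo 'user_pref("network.trr.mode", 5);' >> "$PROFILE/user.js"
cat "$PROFILE/user.js"
```

Firefox reads user.js on startup, so restart the browser after adding the line.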