After updating your system, the next important task is configuring SSH access. You won’t be operating the system at its console most of the time, and the most secure way to reach a headless system is via SSH.
(1) Log into the system as you would normally from a remote machine. You will be asked to provide a password.
(2) Configure your SSH resources (files, directories, permissions)
[root@machine ~]# mkdir -p .ssh
[root@machine ~]# touch .ssh/authorized_keys
[root@machine ~]# chmod g-w .
[root@machine ~]# chmod 700 .ssh/
[root@machine ~]# chmod 600 .ssh/authorized_keys
[root@machine ~]# exit
(3) Publish your SSH public key to the system (this was done from OS X and assumes you have already generated an SSH key pair in the default location in your home directory).
$ cat /Users/username/.ssh/id_rsa.pub | ssh root@machine 'cat >> .ssh/authorized_keys'
Logins from your workstation to the server should no longer require a password; your SSH key pair will authenticate you on the remote machine.
If you have a firewall, ensure that port 22 (the default SSH port) is configured appropriately. Because my DevOps setup is not open to the world, that port is closed on my firewall, which means I can only access the machine while on the internal network. That may change if the need to SSH in away from home becomes pressing.
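Once key-based login works, you can optionally harden the SSH daemon itself. Below is a minimal sketch of settings for /etc/ssh/sshd_config (standard OpenSSH directives; keep an existing session open while testing so you cannot lock yourself out):

```
# /etc/ssh/sshd_config -- apply only after key login is confirmed
PasswordAuthentication no         # keys only; stops password guessing
PermitRootLogin without-password  # root may log in with a key, never a password
```

Reload the daemon afterwards with: systemctl reload sshd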
After installing CentOS 7, the first thing you should do is update the currently installed software, including the kernel itself. All of this can be done with yum.
# yum update
Then of course you should reboot before setting about configuring the system.
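After the reboot, a quick sanity check confirms you are actually running the freshly installed kernel:

```shell
# Show the running kernel release; after the post-update reboot it
# should match the newest kernel installed (rpm -q kernel lists them).
uname -r
```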
My DevOps machine currently runs Windows Server 2012 R2, but I have decided that my backend machines will run Linux. At the very least, this move forces me to master Linux (CentOS in this case), but I also buy into the idea that Linux servers are generally more efficient than Windows servers. Besides, I am running lean DevOps and would like to avoid the expense of Microsoft product licensing.
And thus I install CentOS 7. I downloaded the ISO and created a bootable DVD for use.
- Insert the bootable DVD into the computer’s media player and restart the computer. If it does not automatically boot to DVD, change the first boot device in BIOS to CD/DVD.
- When the DVD is picked up for boot:
- Test this media &amp; install CentOS 7 [Next]
- Language for installation = English [Continue]
- Date and Time = Americas/Denver, Keyboard = English (US), Language = English (United States)
- Software selection
- Basic Web Server (Backup Client, Debugging Tools, Directory Client, Hardware Monitoring Tools, Load Balancer, Network File System Client, Performance Tools, Remote Management for Linux, Compatibility Libraries, Development Tools, Security Tools).
- Installation Destination (ATA WDC WD10EZEX-08M = 931.51GB) sda
- Choose to configure partitioning yourself.
- Manual partitioning:
- Select to create partitions automatically, then make adjustments:
- /data (300GB) = just data, separating applications and their data.
- /apps (200GB) = specific applications (services)
- /boot (500MB) sda1 = boot partition
- / (423GB) for everything else.
- swap (8GB) = since the system has 8GB of memory.
- [Done] A warning/confirmation will be presented, indicating existing partitions will be deleted, and new ones created and formatted.
- Network and Hostname: Ethernet (enp3s0) connected to use DHCP (gets 192.168.1.110 because I’ve configured my router to always give this MAC address that IP address), hostname = workhorse.
- [Begin Installation] and setup the root password. Installation proceeds and the system reboots.
After installation and reboot, you will see a boot screen: “CentOS Linux 7 (Core), with Linux 3.10.0-229.el7.x86_64”, along with a *rescue* option. Select the first entry.
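Once booted and logged in, you can verify the partition layout from the shell (the exact output will differ on your hardware; these commands and /proc files are standard on any Linux system):

```shell
# Mounted filesystems and their sizes; /, /boot, /data and /apps
# should all appear with roughly the sizes chosen at install time.
df -h
# The kernel's view of block devices and partitions (sda1, sda2, ...).
cat /proc/partitions
# Active swap devices and their sizes.
cat /proc/swaps
```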
At a minimum, the home business practicing DevOps needs a server to host the various tools and possibly virtual machines that will be used in the operation. This machine would ideally be a top of the line server with the latest processor, gobs of memory, and terabytes of storage, but my economy requires that I make the best of what I already have.
On hand is a box I built 6 years ago based on a Gigabyte GA-H67MA-UD2H-B3 motherboard hosting an Intel H67 Express chipset. The only purchases I am making to soup up the machine are a pair of hard drives and additional memory. Its complete specs are thus:
- Intel dual-core i5 CPU clocked at 3.2GHz. I know it is end of life at this point, but it still works.
- Gigabyte GA-H67MA-UD2H-B3 motherboard with Intel H67 Express chipset.
- 8GB of DDR3 SDRAM clocked at 1333MHz.
- 2TB of total storage on a pair of WD Blue 1TB desktop hard drives (7200 RPM, SATA 6 Gb/s, 64MB cache, 3.5-inch, WD10EZEX). This Gigabyte motherboard supports XHD, so I plan to enable it and see what good it is. I know I might regret this decision down the road.
- Realtek RTL8111E network card (10/100/1000 Mbit).
- VGA monitor (although it will run headless for the most part).
- USB keyboard.
- USB DVD drive.
This server is intended to host development services such as FTP, DNS, web services, automated build tools and a pre-production VM. Down the road, I will need to purchase an additional server to host more complex configurations of test bays and attached RAID storage.
With the new year comes a new resolution for my software development business: to do DevOps right! It is time to follow industry best practices for developing and delivering software, from the implements of the trade to the latest technologies for implementing software. The plan is to take the hardware I have and an existing application and “upgrade” them using modern DevOps techniques. By December this year, I’ll have a world-class DevOps operation right here in my home office.
Another goal in this resolution is to achieve modern DevOps on the cheap, so open-source software will be my mainstay. I will, however, log how much time I expend on this effort and try to determine what such operations really cost in today’s market. I also intend to pick up a host of new skills along the way. Finally, I want to learn about the business of software: how to market and license your products for profit.
As you get more serious about Java EE 7 development, the logical first step is to upgrade to Java 1.7 everywhere. You must decide between the 32-bit and 64-bit versions, with an eye toward your operating system and the IDE and tools you will be using. The 32-bit version is a safer bet if you still use older tools, but I’d like to be 64-bit everywhere.
Download the JDK here: http://www.oracle.com/technetwork/java/javase/downloads/index.html. Install it, and add its /bin directory to the PATH environment variable. Also set JAVA_HOME to the installation’s root directory and JRE_HOME to its /jre directory (the server JRE). You might also have installed a public JRE; note that this is a client JRE used by web browsers (the Java plug-in) and Java Web Start.
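On OS X or Linux, the environment setup can be sketched in your shell profile as follows; the install path shown is hypothetical, so substitute wherever your JDK actually landed. On Windows, set the same variables under System Properties, Environment Variables.

```shell
# Hypothetical JDK install path -- substitute your actual location.
export JAVA_HOME=/usr/java/jdk1.7.0_67
# The server JRE ships inside the JDK under /jre.
export JRE_HOME="$JAVA_HOME/jre"
# Put the JDK tools (java, javac, keytool, ...) first on the PATH.
export PATH="$JAVA_HOME/bin:$PATH"
```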
$ java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
If you want to use the GlassFish application server in your development, you should also download the Java EE 7 SDK (without JDK) from http://www.oracle.com/technetwork/java/javaee/downloads/index.html. When you install it, you might get this error:
This application needs version 1.6 or higher of the Java 2 Runtime Environment.
If the required Java 2 Runtime Environment is not installed, you can download it from the following website: http://java.sun.com/j2se.
Or if you already have the required Java 2 Runtime Environment installed, try rerunning this application with the following usage:
"java_ee_sdk-7-windows.exe -j <JAVA installation directory>"
Never mind that I already have JAVA_HOME set in my environment variables, or that running “java” on the command line succeeds; this error does not make sense at all. Install with the command-line suggestion anyway.
Buying an SSL certificate should not be an overly emotional event. The easiest approach is to understand what the certificate will do for you: (1) encrypt your session, and (2) deliver a level of trust to clients.
The most important considerations should focus on how well the certificate protects your data (128- versus 256-bit encryption) and what it protects (a single domain versus subdomains or multiple domains). I almost always choose 256-bit wildcard certificates. They cost a little more per annum, but they provide flexibility should things change on your domain.
The second thing you should consider is the kind of certificate:
* Domain validation (DV) = validates only that the applicant owns the domain registration.
* Organization validation (OV) = validates business information as well.
* Extended validation (EV) = validates the business extensively; not usually for personal use.
I almost always go with a DV SSL for most purposes; it is the default for many SSL purchases. You may see many more features offered as part of your certificate, but I haven’t found much use for them beyond giving the client additional trust in your website: insurance or warranty levels, multi-year deals, and so on.
Of particular interest are mobile-compatible certificates, although none of the SSLs I’ve used to date has failed to work on mobile devices. The so-called “green bar” (EV) certificates, which offer the highest level of trust, also seem unnecessary to me.
So for most of us, a 256-bit DV wildcard single-year SSL certificate gets the job done. Renewals are usually sold at discount. Most vendors also offer graphics and realtime validation of the certificate when a protected page is loaded. A great perk, but you shouldn’t have to pay additionally for it.
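Once you have a certificate, the openssl CLI lets you confirm what you actually bought. As a sketch, the commands below generate a throwaway self-signed wildcard certificate (example.com is a placeholder domain) and then read back its subject; a CA-signed wildcard certificate shows the same CN field:

```shell
# Generate a throwaway self-signed wildcard cert for illustration.
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out cert.pem \
    -days 365 -nodes -subj "/CN=*.example.com"
# Print the subject to confirm the wildcard common name.
openssl x509 -in cert.pem -noout -subject
```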
Webapps run on the Internet over HTTP, which rides on TCP/IP, and TCP/IP is inherently insecure. To secure traffic at the Internet layer of the protocol suite, Internet Protocol Security (IPSec) is the answer: it authenticates and encrypts network communications at the IP level.
A complete solution however also requires a firewall to control access, and vigilant patching of operating systems and dependent applications/libraries.
IPSec uses two separate protocols to secure data transmission:
(1) Authentication Header (AH) for authentication and integrity verification of packets, and
(2) Encapsulating Security Payload (ESP) for encryption services.
IPSec helps guard against attacks such as:
- Eavesdropping (reading data in transit)
- Address spoofing (rerouting data in transit)
- Man-in-the-middle attacks (modifying data in transit)
- Denial of service (DoS) attacks = flooding a system with unnecessary traffic until it is overwhelmed and unresponsive.
- Sniffer attacks (capturing and analyzing network traffic).
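To make the two protocols concrete, here is a minimal sketch of a host-to-host tunnel using Libreswan, the IPSec implementation shipped with CentOS 7. The peer address and pre-shared-key authentication are assumptions for illustration; certificate-based authentication is preferable in production:

```
# /etc/ipsec.d/host-to-host.conf (Libreswan) -- illustrative sketch
conn host-to-host
    left=192.168.1.110    # this machine
    right=192.168.1.111   # hypothetical peer on the same network
    authby=secret         # pre-shared key stored in /etc/ipsec.secrets
    auto=start            # negotiate the tunnel when ipsec starts
```

With such a connection up, ESP encrypts and authenticates all IP traffic between the two hosts.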
Independent contractors, especially web developers, have most likely experienced this situation: somebody sends you a one-paragraph description of a webapp they’d like and asks how much it will cost and when you can deliver it.
An application called SKIVI, designed to help independent contractors run their
businesses. Users can self-register accounts for their business, and when logged in,
can manage their contact and login information. This first module should also be
secure and look nice, using the latest web technologies.
That is a typical description, and in fact a real one. You really cannot answer either question yet. The first question to answer is whether you can take on such a project at all: whether you have the skills and time for it [YES]. And before you answer anything further, you need a more specific description, similar to what I described in http://blog.strive-ltd.com/2012/11/how-webapps-are-made-pt-2/.
So I’ll expand the given description and develop a full-blown application in PHP with the Symfony2 framework, using the NetBeans IDE. Along the way, I’ll answer those initial questions.