Server-Side Attacks
In today’s globally connected cyber community, network and OS-level attacks are well defended through the proper deployment of technical security controls such as firewalls, IDS, data loss prevention, and endpoint security. However, web servers must be accessible from anywhere on the web, which makes them an exposed and attractive target for attack.
1. What is the process called that cleans and scrubs user input in order to prevent it from exploiting security holes by proactively modifying user input?
2. Name the process that tests user- and application-supplied input. The process is designed to prevent malformed data from entering an information system by verifying that user input meets a specific set of criteria (e.g., a string that does not contain standalone single quotation marks).
3. Secure SDLC is the process of ensuring security is built into web applications throughout the entire software development life cycle. Name three reasons why organizations might fail at producing secure web applications.
4. How might an attacker exploit the robots.txt file on a web server?
5. What steps can an organization take to obscure or obfuscate their contact information on domain registry web sites?
6. True or False: As a network defender, client-side validation is preferred over server-side validation because it makes attacks easier to defend against.
1) Input sanitization is the process that cleans and scrubs user input in order to prevent it from exploiting security holes by proactively modifying user input. Sanitization helps mitigate the dangers of SQL injection, cross-site scripting (XSS), and remote file inclusion (RFI).
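As an illustration, a minimal Python sketch of input sanitization might look like the following; the sanitize function, its escaping choices, and the control-character policy are assumptions for this example, not a complete defense:

```python
import html
import re

def sanitize(user_input: str) -> str:
    """Illustrative sanitizer: proactively modifies untrusted input.

    A sketch only; real applications should prefer context-aware
    encoding and parameterized queries over blanket sanitization.
    """
    # Encode HTML metacharacters so input cannot break out into markup (XSS).
    cleaned = html.escape(user_input, quote=True)
    # Strip control characters (an assumed policy for this sketch).
    cleaned = re.sub(r"[\x00-\x1f\x7f]", "", cleaned)
    return cleaned

if __name__ == "__main__":
    print(sanitize("<script>alert('xss')</script>"))
    # Prints: &lt;script&gt;alert(&#x27;xss&#x27;)&lt;/script&gt;
```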
2) Input validation is the process that tests user- and application-supplied input. The process is designed to prevent malformed data from entering an information system by verifying that user input meets a specific set of criteria (e.g., a string that does not contain standalone single quotation marks). Input validation prevents improperly formed data from entering an information system and should be performed whenever data is accepted from external parties, especially untrusted sources.
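A minimal Python sketch of allowlist-style validation; the USERNAME_RE pattern and the helper names are illustrative assumptions, not a standard:

```python
import re

# Allowlist pattern: letters, digits, underscores, 3-20 chars (assumed policy).
USERNAME_RE = re.compile(r"^\w{3,20}$", re.ASCII)

def is_valid_username(value: str) -> bool:
    """Reject input that does not meet the expected criteria."""
    return bool(USERNAME_RE.fullmatch(value))

def contains_standalone_quote(value: str) -> bool:
    """Flag strings with an unpaired single quote, per the example criterion."""
    return value.count("'") % 2 == 1

print(is_valid_username("alice_01"))         # True
print(is_valid_username("bob'; DROP--"))     # False: fails the allowlist
print(contains_standalone_quote("O'Brien"))  # True: one standalone quote
```

Note the difference from sanitization: validation rejects malformed input outright instead of modifying it.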
3) Three common reasons organizations fail at producing secure web applications are: (a) cost and schedule pressure, which squeezes security work out of the development budget; (b) lack of security training among developers, who may not recognize flaws such as injection or broken authentication; and (c) weaknesses in the security systems and testing processes being used, so vulnerabilities slip through to release.
4) Most websites provide a robots.txt file, which tells search engine crawlers which pages and directories they should not index. Because the file is publicly readable, an attacker can use it to identify the content management system in use and to map directories the site owner would rather keep out of search results.
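For instance, a hypothetical robots.txt like the one below hands an attacker a ready-made list of paths worth probing (the paths shown are invented for illustration):

```
User-agent: *
Disallow: /wp-admin/   # reveals a WordPress install
Disallow: /backup/     # suggests archived copies of the site
Disallow: /staging/    # points at a non-production environment
```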
5) Organizations can use a proxy service or private domain registration to obfuscate this information. Typically, when you register a domain name, your contact information is made publicly available. Private registration is a service offered to domain name registrants that keeps their personal information out of public WHOIS lookups, letting the organization control who can reach it and when.
6) False. Client-side validation can be manipulated and bypassed easily. For example, JavaScript-powered validation can be turned off in the user's browser, fail due to a scripting error, or be maliciously circumvented with little effort. An attacker can also craft a custom application that sends HTTP requests with arbitrary headers and content, even spoofing the User-Agent header to claim it is a real browser. By validating on the server side, you ensure that any client-side restrictions that were bypassed are checked again before the data is stored or reflected back to the user.
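A minimal sketch of server-side re-validation using Flask; the /comment route, the form field name, and the length/character policy are all assumptions for illustration:

```python
import html
import re

from flask import Flask, request, abort

app = Flask(__name__)

# Server-side policy (assumed): comments are 1-280 printable ASCII characters.
COMMENT_RE = re.compile(r"^[\x20-\x7e]{1,280}$")

@app.route("/comment", methods=["POST"])
def comment():
    text = request.form.get("text", "")
    # Re-validate on the server: client-side checks may have been bypassed
    # by disabling JavaScript or by crafting the request directly.
    if not COMMENT_RE.fullmatch(text):
        abort(400, "Invalid comment")
    # Encode before reflecting the value back to the user.
    return f"<p>Received: {html.escape(text)}</p>"

if __name__ == "__main__":
    app.run()
```

Because an attacker can reproduce the same POST with a tool like curl, the server treats every request as untrusted and re-checks it regardless of any validation that ran in the browser.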