
Part V: Designing Your Schema

Chapter 11: Creating and Altering Tables



Building a robust database schema starts with thoughtfully created tables and evolves with carefully applied modifications. In this chapter, you’ll learn how to:

  • Define tables using CREATE TABLE

  • Choose appropriate data types and sizes for each column

  • Enforce data-quality rules through constraints (NOT NULL, UNIQUE, CHECK)

  • Evolve schemas safely with ALTER TABLE operations

By following these best practices, you’ll lay the foundation for reliable, scalable databases that adapt to changing business needs.

1. Defining Tables with CREATE TABLE

1.1 Core Syntax

The CREATE TABLE statement introduces a new table and its columns to the database. Always specify columns explicitly along with their data types and constraints:

sql
CREATE TABLE table_name (
  column1_name data_type [constraints],
  column2_name data_type [constraints],
  …,
  columnN_name data_type [constraints]
);
  • table_name: your chosen table identifier

  • column_name: clear, descriptive names

  • data_type: storage type, size, and precision

  • constraints: rules ensuring data integrity

1.2 Example: Employees Table

sql
CREATE TABLE employees (
  employee_id  SERIAL      PRIMARY KEY,
  first_name   VARCHAR(50) NOT NULL,
  last_name    VARCHAR(50) NOT NULL,
  email        VARCHAR(100) UNIQUE,
  hire_date    DATE        DEFAULT CURRENT_DATE
);
  • SERIAL: auto-incrementing integer (PostgreSQL).

  • PRIMARY KEY: enforces unique, non-null identity.

  • NOT NULL: guarantees presence of first and last names.

  • UNIQUE: prevents duplicate email addresses.

  • DEFAULT CURRENT_DATE: auto-populates hire_date when not specified.
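
With this definition in place, an INSERT that omits employee_id and hire_date lets the database fill them in; the sample values here are purely illustrative:

sql
INSERT INTO employees (first_name, last_name, email)
VALUES ('Ada', 'Lovelace', 'ada@example.com');
-- employee_id comes from the SERIAL sequence;
-- hire_date defaults to the current date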

2. Choosing Appropriate Data Types and Sizes

Selecting the right data type balances storage efficiency, performance, and accuracy. Over-allocating wastes disk space; under-allocating risks data truncation.

2.1 Common Data Type Categories

  Category        Purpose                  Examples
  Integers        Whole numbers            SMALLINT, INTEGER, BIGINT
  Fixed-point     Exact decimals           DECIMAL(p, s), NUMERIC(p, s)
  Floating-point  Approximate decimals     REAL, DOUBLE PRECISION
  Text            Variable-length strings  VARCHAR(n), TEXT
  Temporal        Dates and times          DATE, TIME, TIMESTAMP
  Boolean         True/false flags         BOOLEAN

2.2 Guidelines for Sizing

  • Integers: Use SMALLINT when values fit within −32,768 to 32,767; INTEGER for values up to about ±2.1 billion; BIGINT for larger sequences.

  • Strings: Allocate VARCHAR length based on realistic maximums. For free-form text, use unbounded TEXT.

  • Decimals: Match precision (p) and scale (s) to business requirements—e.g., DECIMAL(10,2) for currency.

  • Dates/Timestamps: Prefer TIMESTAMP WITH TIME ZONE if you work across regions.
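
As a sketch of these guidelines, a hypothetical products table might size its columns like this:

sql
CREATE TABLE products (
  product_id   BIGINT PRIMARY KEY,       -- large key space expected
  sku          VARCHAR(20) NOT NULL,     -- realistic maximum, not an arbitrary 255
  description  TEXT,                     -- free-form, unbounded text
  price        DECIMAL(10,2) NOT NULL,   -- exact currency, two decimal places
  stock_count  SMALLINT,                 -- small whole numbers
  created_at   TIMESTAMP WITH TIME ZONE  -- safe across regions
);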

3. Enforcing Rules Through Constraints

Constraints are schema-level validations that prevent invalid data from entering your tables. They shift error detection from application code into the database engine.

3.1 NOT NULL

Disallows missing values:

sql
first_name VARCHAR(50) NOT NULL

3.2 UNIQUE

Guarantees column value uniqueness:

sql
email VARCHAR(100) UNIQUE

3.3 PRIMARY KEY

A combination of NOT NULL and UNIQUE that defines the table’s identity:

sql
PRIMARY KEY (employee_id)
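
A primary key can also span multiple columns. In this illustrative join table, each row is identified by the pair of IDs:

sql
CREATE TABLE project_assignments (
  employee_id INT,
  project_id  INT,
  PRIMARY KEY (employee_id, project_id)  -- each employee/project pair appears once
);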

3.4 FOREIGN KEY

Maintains referential integrity between tables:

sql
department_id INT,
CONSTRAINT fk_dept FOREIGN KEY (department_id)
  REFERENCES departments(department_id)
  ON DELETE SET NULL
  • ON DELETE SET NULL: child rows nullified if parent is removed.

  • Alternatively, RESTRICT (prevent deletes) or CASCADE (auto-delete children).
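
Putting the pieces together, here is a minimal parent/child pair; the departments table and the simplified employees definition are illustrative:

sql
CREATE TABLE departments (
  department_id SERIAL PRIMARY KEY,
  name          VARCHAR(100) NOT NULL
);

CREATE TABLE employees (
  employee_id   SERIAL PRIMARY KEY,
  department_id INT,
  CONSTRAINT fk_dept FOREIGN KEY (department_id)
    REFERENCES departments (department_id)
    ON DELETE SET NULL
);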

3.5 CHECK

Applies custom rules using expressions:

sql
salary NUMERIC(10,2) CHECK (salary > 0)

Use CHECK to enforce domain-specific logic—age ranges, valid statuses, score boundaries, etc.
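
A CHECK constraint can also be declared at the table level and reference more than one column; this hypothetical contracts table ensures an end date never precedes its start date:

sql
CREATE TABLE contracts (
  contract_id SERIAL PRIMARY KEY,
  start_date  DATE NOT NULL,
  end_date    DATE,
  CHECK (end_date IS NULL OR end_date > start_date)  -- multi-column rule
);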



4. Evolving Schemas with ALTER TABLE

Real-world requirements change. ALTER TABLE lets you add, modify, or drop columns and constraints without dropping and recreating tables.

4.1 Adding Columns

sql
ALTER TABLE employees
ADD COLUMN phone VARCHAR(20);
  • Without a DEFAULT clause, the new column is NULL for all existing rows.

  • With a DEFAULT clause, existing rows are populated with the default value.
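
To backfill existing rows at the same time, add the column with a DEFAULT; the column and value here are illustrative:

sql
ALTER TABLE employees
ADD COLUMN status VARCHAR(10) NOT NULL DEFAULT 'active';
-- existing rows receive 'active'; new rows default to 'active' as well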

4.2 Modifying Column Types or Defaults

sql
ALTER TABLE employees
ALTER COLUMN hire_date SET DEFAULT '2025-01-01';

ALTER TABLE employees
ALTER COLUMN email TYPE VARCHAR(150);
  • Changing type may require data migration if incompatible.

  • PostgreSQL supports USING clause for custom casts:

    sql
    ALTER TABLE employees
    ALTER COLUMN salary TYPE DECIMAL(12,2)
    USING salary::DECIMAL(12,2);
    
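
In PostgreSQL, DDL statements are transactional, so related schema changes can be grouped and rolled back together if any step fails (a sketch):

sql
BEGIN;
ALTER TABLE employees ALTER COLUMN email TYPE VARCHAR(150);
ALTER TABLE employees ALTER COLUMN hire_date SET DEFAULT '2025-01-01';
COMMIT;  -- either both changes apply, or neither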

4.3 Renaming Columns and Tables

sql
ALTER TABLE employees
RENAME COLUMN hire_date TO start_date;

ALTER TABLE employees
RENAME TO staff;
  • Keeps data intact while aligning names with new business terminology.

4.4 Dropping Columns or Constraints

sql
ALTER TABLE employees
DROP COLUMN phone;

ALTER TABLE employees
DROP CONSTRAINT fk_dept;
  • Dropping a column removes all its data—use cautiously.

  • Use CASCADE or RESTRICT to manage dependent objects.
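
For example, dropping a column that other objects depend on (such as a view) requires CASCADE in PostgreSQL; the column name here is illustrative:

sql
ALTER TABLE employees
DROP COLUMN department_id CASCADE;  -- also drops dependent views and constraints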

5. Best Practices for Schema Design

  1. Plan Ahead: Map entities and relationships in an ERD before coding.

  2. Use Meaningful Names: Tables and columns should reflect domain concepts.

  3. Enforce Data Quality at the Schema Level: Leverage constraints over application checks.

  4. Minimize Nullable Columns: Favor required fields to reduce ambiguity.

  5. Version Control Your DDL: Store CREATE and ALTER scripts in Git.

  6. Test Schema Changes: Validate in development and staging before production.

  7. Monitor Impact: Use database metrics to gauge performance after schema changes.
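
Practices 5 and 6 often take the form of small, numbered migration scripts checked into Git, each paired with a rollback; the file names and contents below are hypothetical:

sql
-- migrations/003_add_phone_to_employees.sql
ALTER TABLE employees
ADD COLUMN phone VARCHAR(20);

-- migrations/003_rollback.sql
ALTER TABLE employees
DROP COLUMN phone;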

Conclusion

Defining and evolving your database schema with CREATE TABLE and ALTER TABLE commands is foundational for any data-driven application. By selecting the right data types, enforcing constraints, and applying incremental schema updates, you build a flexible, maintainable data model that scales with your organization’s needs.
