Definition: West coast - a region along the western edge of a landmass or country, typically with an extensive coastline and coastal cities and towns. Definition: in the United States, the West Coast refers to the states along the Pacific Ocean, west of the Mississippi River; the term is sometimes extended to include the Pacific coast of Canada.