Namibia

Noun

A country in southern Africa known for its desert terrain and Atlantic coastline.
