{{tag>
======Network Setup======
Most servers have more than one network connection, although one is technically enough.

Debian Linux supports multiple methods to define network connections:
  - ''/etc/network/interfaces''
  - NetworkManager
  - systemd-networkd
  - netplan
As usual they all have their own pros and cons. Care also needs to be taken not to have conflicting methods operating at the same time, particularly on the same interface.
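As a rough sanity check (not part of the original notes, and assuming the respective tools are installed), the following commands show which of these methods is currently active on a machine:
<code bash>
# Interfaces managed by systemd-networkd (if the service is running)
networkctl list

# Devices handled by NetworkManager
nmcli device status

# Classic ifupdown definitions, if any
grep -v '^#' /etc/network/interfaces /etc/network/interfaces.d/* 2>/dev/null

# netplan configuration files, if any
ls /etc/netplan/
</code>
An interface should only show up as managed/configured under one of these methods; anything claimed by two of them is a likely source of conflict.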
++++old, tldr;|
The home server I have has 4 Intel Gigabit NICs. For the past couple of years I have only been using 1 NIC, connected to a main 24 port gigabit switch. This is described in the Basic Network Setup below. The newer home server has 5 drives: 2 SSD system drives, 2 larger data storage drives and 1 drive used as a parity drive for offline RAID covering the data storage drives. Most of the time a single NIC will provide sufficient bandwidth between the server and switch. However the server has the capacity to saturate the bandwidth of a single gigabit NIC. To increase effective bandwidth there is an option to bond 2 or more NICs together to combine their bandwidth. This is called NIC bonding. To allow virtual machine NIC access the NIC(s) must be set up in bridge mode. Furthermore, bridging NICs can also allow the NICs to act as a switch, obviously where more than one NIC is available. The Full Network Setup section below describes setting up the system with bonded and bridged NICs. All these described setups were found to operate well.
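As an illustration of the bonding-plus-bridging idea (a sketch only, not the exact configuration used here), the classic ''/etc/network/interfaces'' approach with the ''ifenslave'' and ''bridge-utils'' packages might look roughly like this; the interface names, bond mode and addresses below are placeholders:
<code>
# /etc/network/interfaces -- illustrative sketch, names and addresses are placeholders
auto eno1
iface eno1 inet manual

auto eno2
iface eno2 inet manual

# bond0 aggregates the two physical NICs (802.3ad/LACP requires switch support)
auto bond0
iface bond0 inet manual
    bond-slaves eno1 eno2
    bond-mode 802.3ad
    bond-miimon 100

# br0 holds the host IP address and is the interface virtual machines attach to
auto br0
iface br0 inet static
    address 192.168.1.10/24
    gateway 192.168.1.1
    bridge_ports bond0
    bridge_stp off
    bridge_fd 0
</code>
The same topology can also be expressed with systemd-networkd or netplan; whichever method is chosen, only one of them should own these interfaces.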
I added a 2.5GbE NIC card to my servers and switch.
Some references are noted in the References section below.
=====References=====
=====Archived Network Setups=====