Location/naming scheme for MPICH toolchain #42
P.S., where applicable, I'll keep the dependency on OpenBLAS... assuming the software I use works adequately with it.
I think it's a good idea, although it essentially duplicates each formula, which complicates maintenance. I had a request recently for MUMPS/MPICH. Is there an elegant way to factor out common code in such duplicated formulae?
@dpo Do you have a preference between OpenMPI and MPICH? One option is to support only MPICH in this repo. (I have no particular preference myself.) It would be simpler to support only one compiler, one BLAS implementation, and one MPI implementation.
@sjackman I tend to use OpenMPI because it was the default Homebrew MPI formula at some point (when :mpi still existed). It also seems to be somewhat more widespread. If we support only one, I think we should stick with OpenMPI. I think it would be a good thing to support both, but not at the expense of doubling the maintenance effort.
Homebrew core still rejects MPICH dependencies so that you never end up with clashing MPI libraries when linking against multiple libraries, e.g., ScaLAPACK built with MPICH while HDF5 is built with Open MPI.
There are tentative plans for better tap support: inheriting from formulae in core (and perhaps elsewhere) and being able to override dependencies and the arguments passed to configure/cmake/etc. If/when this comes to fruition, it should reduce the maintenance burden.
Except in cases where formulae rely on features present in one implementation but not the other, using one library over the other should just be a matter of changing `depends_on "open-mpi"` to `depends_on "mpich"`.
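For concreteness, a minimal sketch of what such a swapped formula could look like; the name `foo-mpich.rb`, URL, checksum, and configure arguments are placeholders, not an actual brewsci formula:

```ruby
# Hypothetical foo-mpich.rb; everything except the MPI dependency is
# identical to the open-mpi variant. URL/checksum are placeholders.
class FooMpich < Formula
  desc "Example scientific library built against MPICH"
  homepage "https://example.com/foo"
  url "https://example.com/foo-1.0.tar.gz"
  sha256 "0000000000000000000000000000000000000000000000000000000000000000"

  depends_on "mpich" # was: depends_on "open-mpi"
  depends_on "openblas"

  def install
    # The mpicc/mpif90 wrappers now come from MPICH; the build itself
    # is unchanged.
    system "./configure", "--prefix=#{prefix}", "CC=mpicc", "FC=mpif90"
    system "make", "install"
  end
end
```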
If you’d rather not include these formulae here (where I would take on the maintenance responsibility for them), I would be happy to see a separate repo/tap under brewsci for them, or I’ll borrow the brewsci CI/bottling setup and implement it in the sorceryinstitute/homebrew-formulae tap (although I would prefer not having to re-engineer/manage that AWS Lambda magic).
One option is to maintain both OpenMPI and MPICH in this tap, adding a `-mpich` suffix to the MPICH formulae.
The AWS Lambda magic mostly works, and I can set it up for you. I want to migrate to GitHub Actions or Azure in the hopefully near future.
If you're willing to help with the maintenance, let's go for it! Thanks!
@dpo: I was discussing with @sjackman adding a toolchain of MPI libraries based on MPICH rather than OpenMPI.
I wanted to check with you first whether it would be acceptable to you to add them as part of brewsci/homebrew-num. The formulae would need an additional level of namespacing.
I'm also interested in swapping clang for gcc/g++, but that's not as large a concern at the moment. In theory, at least the C ABI should be stable and standardized, allowing code built with one C compiler to be linked against code built with a different one. However, build-system introspection can cause differences in how the code ends up being built and which features are supported; this is probably more true for C++ than C. Also, in the past I've run into issues where MPI will stubbornly wrap the compiler it was built with, or, when it does wrap a different compiler, builds end up failing or runtime bugs appear.
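For what it's worth, the MPI compiler wrappers can usually be pointed at another compiler via the environment variables they honor (MPICH_CC/MPICH_CXX for MPICH, OMPI_CC/OMPI_CXX for Open MPI). A rough sketch, assuming a formula's install method; the configure arguments are illustrative only:

```ruby
# Sketch only (not an existing formula): point the MPI wrappers at the
# compiler Homebrew selected rather than the one MPI was built with.
# MPICH's wrappers honor MPICH_CC/MPICH_CXX; Open MPI's honor
# OMPI_CC/OMPI_CXX.
def install
  ENV["MPICH_CC"]  = ENV.cc   # e.g. clang, or gcc-9 if we switch compilers
  ENV["MPICH_CXX"] = ENV.cxx
  system "./configure", "--prefix=#{prefix}", "CC=mpicc", "CXX=mpicxx"
  system "make", "install"
end
```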
If you give it the go-ahead, then I'll add packages switching OpenMPI for MPICH (and add a `-mpich` suffix to the file/formula name).
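For illustration, a hedged sketch of what that naming scheme could look like, using MUMPS (mentioned above) as a hypothetical example; the URL, version, and checksum are placeholders:

```ruby
# Hypothetical Formula/mumps-mpich.rb: Homebrew derives the class name
# from the file name, so mumps-mpich.rb must define MumpsMpich.
# URL/checksum below are placeholders.
class MumpsMpich < Formula
  desc "Parallel sparse direct solver, built against MPICH"
  homepage "http://mumps.enseeiht.fr"
  url "https://example.com/mumps-5.1.2.tar.gz"
  sha256 "0000000000000000000000000000000000000000000000000000000000000000"

  depends_on "mpich"
  depends_on "openblas"

  # Only one MPI stack should end up linked into a given install, so the
  # MPICH and OpenMPI variants cannot be installed side by side.
  conflicts_with "mumps", because: "both install MUMPS libraries"
end
```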